This drives both site optimization strategies and content creation, ensuring that all SEO efforts resonate with local consumers. For wholesale distributors, this could involve creating detailed guides about services available in different locations or highlighting community involvement and local partnerships. By integrating relevant local keywords and creating content that addresses local issues or interests, businesses can connect more effectively with their target audience while improving their SERP standings. This enhanced understanding enables search engines to display more relevant information in rich snippets on search engine results pages (SERPs), which can lead to higher click-through rates and improved local search rankings.
In effect this means that continuous attention to these aspects of Local SEO will keep your distribution business competitive in an increasingly digital marketplace. Leveraging every available tool, including robust data analysis, ensures sustained improvement in how well a wholesale distributor ranks locally online, turning searches into sales through finer-tuned strategies based on real-time insights. For instance, a real estate agency might benefit from a comprehensive guide detailing neighborhood amenities, which not only serves user needs but also improves local search rankings.
By summarizing and assessing on-page performance and conducting detailed local keyword research, the audit provides valuable insights into your competitive landscape.
The Role of Local Keyword Research
Keywords play a pivotal role in aligning your content with what local customers are searching for online. This process involves analyzing various keyword combinations along with metrics like competition levels and buyer intent to ensure relevance and effectiveness in reaching desired audiences.
Building Local Links and Citations
Developing a network of local citations and backlinks is another pivotal aspect of strengthening your online presence.
Competitive Edge
Using long-tail keywords also provides wholesalers an edge over competitors who may not be as dialed into the nuances of local and niche market SEO. In effect this means that a robust online presence bolstered by strategic local SEO practices is indispensable for wholesale distributors aiming at market dominance within their regions. Optimizing these elements ensures that search engines recognize the appropriate local markets for your business, thereby enhancing visibility where it counts. This involves incorporating your city, region, or country's name naturally across your site's content.
Are you struggling to increase your online store's visibility and sales? You're not alone. In today's competitive digital landscape, mastering eCommerce SEO is vital for maximizing your return on investment. But where do you start?
Welcome to your complete guide to unlocking the power of eCommerce SEO. We'll dive deep into expert insights that will transform your online presence and boost your ROI. From tailored strategies to platform-specific optimization techniques, we've got you covered.
Ready to outrank your competitors and turn browsers into buyers? Let's explore:
• The keys to crafting a winning eCommerce SEO strategy
• Platform-specific optimization tips for Shopify, Magento, and more
• Proven tactics to improve your organic traffic and conversions
• How to leverage technical SEO for improved site performance
Buckle up as we embark on this journey to eCommerce success!
Key Takeaways:
- Recognize the critical role of SEO in improving your online store's visibility and sales
- Implement a comprehensive eCommerce SEO strategy, including keyword research, on-page optimization, and technical SEO
- Tailor your approach to specific platforms like Shopify and Magento for optimal results
- Use content marketing and link building to strengthen your eCommerce SEO efforts
- Measure your success through key metrics like organic traffic and search rankings
Understanding eCommerce SEO
eCommerce SEO is the practice of optimizing online stores to rank higher in search engine results pages (SERPs). It's a vital approach for improving visibility, driving organic traffic, and increasing sales for your online business.
What Sets eCommerce SEO Apart?
Unlike traditional SEO, eCommerce SEO focuses on:
- Product pages
- Category pages
- Shopping cart functionality
- User reviews
- Payment gateways
These elements require specialized optimization techniques to ensure maximum visibility and conversions.
Why Is eCommerce SEO Important?
1. Increased visibility: Higher rankings mean more potential customers can find your store.
2. Cost-effective: Organic traffic is free, reducing your dependence on paid advertising.
3. Long-term results: Unlike paid ads, SEO benefits compound over time.
4. Trust and credibility: Users often trust organic search results more than paid ads.
Key Challenges in eCommerce SEO
- Large product catalogs: Optimizing hundreds or thousands of product pages can be daunting.
- Duplicate content: Similar products can lead to duplicate content issues.
- Frequent inventory changes: Regular updates can hurt SEO if not handled properly.
By understanding these distinct aspects of eCommerce SEO, you can create a strategy that addresses the specific needs of your online store and maximizes your chances of success in the competitive digital marketplace.
Key Takeaway: eCommerce SEO is a specialized form of search engine optimization that focuses on improving the visibility and performance of online stores in search results.
The Importance of SEO for Online Stores
In today's digital landscape, a strong online presence is critical for the success of any eCommerce business. Search engine optimization (SEO) plays a crucial role in achieving this goal. Let's explore why SEO is so essential for online stores:
Increased Visibility and Traffic
SEO helps your eCommerce website rank higher in search engine results pages (SERPs). This increased visibility leads to more organic traffic, as users are more likely to click on top-ranking results. By optimizing your online store for relevant keywords, you can attract potential customers who are actively searching for the products you offer.
Cost-Effective Marketing
Compared to paid advertising, SEO provides a more sustainable and cost-effective way to drive traffic to your eCommerce website. While it may require an initial investment of time and resources, the long-term benefits of organic search traffic can substantially outweigh the costs.
Improved User Experience
SEO best practices typically align with providing a better user experience. By optimizing your website's structure, speed, and content, you not only improve your search rankings but also create a more enjoyable shopping experience for your customers.
Key Takeaway: SEO is essential for online stores because it increases visibility, offers cost-effective marketing, and improves user experience, ultimately driving more traffic and sales.
Key Components of an eCommerce SEO Strategy
Crafting an effective eCommerce SEO strategy requires a multi-faceted approach. Let's look at the key components that can help your online store climb the search engine rankings and attract more potential customers.
Comprehensive Keyword Research
Thorough keyword research forms the foundation of any effective eCommerce SEO strategy. It involves identifying the terms and phrases your target audience uses when searching for products like yours. Focus on long-tail keywords that are specific to your products and have less competition.
On-Page Optimization
Optimizing your product pages is essential for eCommerce success. This includes:
- Writing unique and compelling product descriptions
- Using relevant keywords in titles, meta descriptions, and headers
- Optimizing image alt tags
- Creating user-friendly URLs
Technical SEO
Technical SEO ensures your website is easily crawlable and indexable by search engines. Key aspects include:
- Improving site speed and load times
- Implementing a logical site structure and navigation
- Using schema markup for rich snippets
- Ensuring mobile responsiveness
Content Strategy
A robust content strategy helps establish your brand as an authority in your niche. Create valuable content that addresses your customers' pain points and questions. This can include blog posts, buying guides, and how-to articles related to your products.
Link Building
Earning high-quality backlinks from reputable websites in your industry can significantly boost your search rankings. Focus on creating linkable assets and reaching out to relevant sites for potential collaborations.
Key Takeaway: A comprehensive eCommerce SEO strategy combines keyword research, on-page optimization, technical SEO, content creation, and link building to boost visibility and drive organic traffic.
Keyword Research and Optimization
Keyword research and optimization form the foundation of any successful eCommerce SEO strategy. By identifying and targeting the right keywords, you can significantly improve your online store's visibility and attract more potential customers.
Understanding Long-Tail Keywords
Long-tail keywords are specific, often longer phrases that potential customers use when searching for products. These keywords typically have lower search volume but higher conversion rates. For example, instead of targeting "shoes," you might focus on "comfortable running shoes for women with flat feet."
Conducting Thorough Keyword Research
To find the most relevant keywords for your eCommerce site (a short sketch after this list shows one way to shortlist candidates):
1. Use keyword research tools like Google Keyword Planner or SEMrush
2. Analyze your competitors' keywords
3. Consider customer search intent
4. Look for seasonal trends and product-specific terms
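The sketch below is a minimal illustration of that shortlisting step, assuming you have already exported keyword ideas, with monthly search volume and a competition score, from a tool such as Google Keyword Planner into a CSV file. The file name, column names, and thresholds are hypothetical and should be adapted to whatever your tool actually exports.

```python
import csv

# Hypothetical export from a keyword tool: keyword, monthly_volume, competition (0-1).
INPUT_FILE = "keyword_ideas.csv"

def shortlist_keywords(path, min_volume=50, max_competition=0.4):
    """Keep long-tail candidates: decent volume, low competition, three or more words."""
    picks = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            volume = int(row["monthly_volume"])
            competition = float(row["competition"])
            is_long_tail = len(row["keyword"].split()) >= 3
            if is_long_tail and volume >= min_volume and competition <= max_competition:
                picks.append((row["keyword"], volume, competition))
    # Highest-volume candidates first.
    return sorted(picks, key=lambda k: k[1], reverse=True)

if __name__ == "__main__":
    for keyword, volume, competition in shortlist_keywords(INPUT_FILE):
        print(f"{keyword}: {volume}/mo, competition {competition:.2f}")
```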
Optimizing for the Most Relevant Keywords
Once you've identified your target keywords:
- Incorporate them naturally into product titles, descriptions, and meta tags
- Use keyword variations to avoid keyword stuffing
- Build content around these keywords to attract organic traffic
Remember, keyword optimization is an ongoing process. Regularly review and update your keyword strategy to stay ahead of market trends and changes in search behavior.
Key Takeaway: Effective keyword research and optimization are essential for improving your eCommerce site's visibility and attracting qualified traffic.
On-Page SEO for eCommerce
On-page SEO is critical for eCommerce success. It involves optimizing individual pages to rank higher and earn more relevant traffic from search engines. Let's look at key on-page SEO tactics for eCommerce sites.
Optimize Product Titles and Descriptions
Your product titles and descriptions are prime real estate for on-page SEO. Craft unique, keyword-rich titles that accurately describe your products. Include relevant keywords naturally in your product descriptions, focusing on benefits and features.
Use Header Tags Properly
Organize your content using header tags (H1, H2, H3). Use your main keyword in the H1 tag, which should be your product name. Use H2 and H3 tags for subsections, including secondary keywords where appropriate.
Optimize Product Images
Images play a critical role in eCommerce. Optimize them by:
- Using descriptive, keyword-rich file names
- Adding alt text that describes the image and includes relevant keywords
- Compressing images to improve load times
Write Unique Meta Descriptions
Craft compelling meta descriptions for each product page. Include relevant keywords and a clear call to action to encourage click-throughs from search results.
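As a rough guideline, titles are often kept to around 60 characters and meta descriptions to around 155 so they are not truncated in search results. The sketch below is a minimal, hypothetical length check you might run over a catalog export; the product record and the limits are assumptions, not platform rules.

```python
# Hypothetical product records; in practice these would come from your store's catalog export.
products = [
    {"name": "Trail Running Shoes",
     "title": "Women's Trail Running Shoes | Lightweight & Waterproof | ExampleStore",
     "meta": "Shop lightweight, waterproof trail running shoes for women. Free shipping and 30-day returns."},
]

TITLE_LIMIT = 60   # commonly cited guideline, not a hard rule
META_LIMIT = 155   # commonly cited guideline, not a hard rule

for product in products:
    for field, limit in (("title", TITLE_LIMIT), ("meta", META_LIMIT)):
        text = product[field]
        status = "OK" if len(text) <= limit else f"too long by {len(text) - limit} chars"
        print(f"{product['name']} {field}: {len(text)} chars ({status})")
```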
Implement Schema Markup
Use schema markup to provide search engines with detailed information about your products. This can lead to rich snippets in search results, potentially boosting click-through rates.
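Schema.org defines a Product type that is commonly embedded as JSON-LD in a page's HTML. The sketch below is a minimal example that builds such an object in Python and prints the script tag you would paste into a product page template; every product value shown is a placeholder.

```python
import json

# Placeholder product data; replace with real values from your catalog.
product_schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Women's Trail Running Shoes",
    "image": "https://www.example.com/images/trail-shoes.jpg",
    "description": "Lightweight, waterproof trail running shoes.",
    "sku": "TRS-001",
    "offers": {
        "@type": "Offer",
        "priceCurrency": "USD",
        "price": "89.99",
        "availability": "https://schema.org/InStock",
    },
}

# Print the tag to embed in the product page's HTML.
print('<script type="application/ld+json">')
print(json.dumps(product_schema, indent=2))
print("</script>")
```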
By applying these on-page SEO techniques, you can create well-optimized product pages that rank higher in search results and attract more potential customers to your eCommerce site.
Key Takeaway: Effective on-page SEO for eCommerce involves optimizing product titles, descriptions, images, and meta tags while implementing schema markup to improve search visibility and attract more customers.
Technical SEO for eCommerce Websites
Technical SEO is vital for eCommerce sites to ensure optimal performance and visibility in search results. Let's explore some key elements of technical SEO that can significantly affect your online store's success.
Site Speed Optimization
Site speed is an important factor for both user experience and search engine rankings. Slow-loading pages can lead to high bounce rates and lost sales. To improve your site speed (a short sketch after this list shows one way to handle the first item):
- Compress images and use appropriate file formats
- Minimize HTTP requests
- Enable browser caching
- Use a content delivery network (CDN)
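As one concrete example of the first item, the sketch below resizes and re-encodes a product photo with the Pillow library (installed via `pip install Pillow`); the file names, maximum width, and quality setting are assumptions to adjust for your own image pipeline.

```python
from PIL import Image

# Hypothetical input/output paths; adjust for your own images.
SOURCE = "product-photo-original.jpg"
OUTPUT = "product-photo-web.jpg"
MAX_WIDTH = 1200      # large enough for zoomed views, small enough to load fast
JPEG_QUALITY = 80     # a common balance between file size and visual quality

with Image.open(SOURCE) as img:
    if img.width > MAX_WIDTH:
        ratio = MAX_WIDTH / img.width
        img = img.resize((MAX_WIDTH, int(img.height * ratio)))
    # optimize=True asks Pillow to do an extra pass to shrink the file further.
    img.save(OUTPUT, "JPEG", quality=JPEG_QUALITY, optimize=True)

print(f"Wrote {OUTPUT}")
```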
Mobile Responsiveness
With the growing number of mobile shoppers, having a mobile-responsive website is essential. Make sure your eCommerce site:
- Adapts to different screen sizes
- Is easy to navigate on mobile devices
- Loads quickly on mobile networks
Schema Markup Implementation
Schema markup helps search engines understand your content better, potentially leading to rich snippets in search results. For eCommerce sites, consider implementing:
- Product schema
- Review schema
- Breadcrumb schema
URL Structure and Site Architecture
A well-organized site structure helps both users and search engines navigate your eCommerce store. Implement:
- Clear, descriptive URLs
- A logical category and subcategory structure
- Internal linking to improve crawlability
By focusing on these technical SEO elements, you can build a solid foundation for your eCommerce site's search engine performance and user experience.
Key Takeaway: Technical SEO for eCommerce websites involves improving site speed, ensuring mobile responsiveness, implementing schema markup, and creating a clear site structure to boost search visibility and user experience.
Platform-Specific eCommerce SEO
When it comes to eCommerce SEO, one size doesn't fit all. Different platforms have distinct features and requirements that can significantly affect your SEO approach. Let's look at how to optimize your SEO efforts on some of the most popular eCommerce platforms.
Shopify SEO
Shopify is a popular choice for many online retailers. To maximize your Shopify site's SEO potential:
- Customize your store's structure for intuitive navigation
- Use Shopify's built-in SEO features, like customizable title tags and meta descriptions
- Take advantage of Shopify apps for additional SEO capabilities
Magento SEO
Magento offers robust SEO capabilities for larger eCommerce operations:
- Use Magento's SEO-friendly URL structure
- Implement layered navigation for better user experience and SEO
- Take advantage of Magento's built-in XML sitemap generator
WooCommerce SEO
For WordPress-based online stores, WooCommerce offers excellent SEO opportunities:
- Install SEO plugins like Yoast SEO for WooCommerce
- Optimize product categories and tags
- Use WordPress's permalink structure for SEO-friendly URLs
BigCommerce SEO
BigCommerce offers a number of SEO-friendly features out of the box:
- Use BigCommerce's automatic 301 redirects
- Take advantage of built-in microdata for rich snippets
- Use BigCommerce's CDN for better site speed
Remember, whichever platform you choose, working with a qualified partner or eCommerce SEO professional can help you navigate platform-specific nuances and implement the most effective SEO strategies for your online store.
Key Takeaway: Each eCommerce platform has unique SEO features and requirements, so tailoring your approach to your specific platform is essential for maximizing your online visibility and success.
Content Marketing for eCommerce SEO
Content marketing plays an essential role in eCommerce SEO, helping businesses attract and engage their target audience while improving search engine rankings. By creating valuable, relevant content, online stores can establish authority, build trust, and drive organic traffic.
Creating Valuable Content for Your Target Customer
To reach your target audience effectively, focus on producing content that addresses their needs, interests, and pain points. This may include:
- Product guides and comparisons
- How-to articles and tutorials
- Industry news and trends
- Customer success stories and testimonials
By offering valuable information, you can position your brand as a trusted resource and encourage customers to choose your products over competitors'.
Optimizing Content for Search Engines
While creating content for your audience, don't forget to optimize it for search engines:
- Include relevant keywords naturally throughout your content
- Use descriptive, keyword-rich titles and meta descriptions
- Add internal links to other relevant pages on your site
- Optimize images with alt text and descriptive file names
Leveraging Different Content Formats
Diversify your content strategy by using different formats to appeal to different segments of your audience:
- Blog posts and articles
- Infographics and visual content
- Videos and product demonstrations
- Podcasts and audio content
By offering a mix of content types, you can cater to different learning styles and preferences, increasing engagement and reach.
Key Takeaway: Effective content marketing for eCommerce SEO involves producing valuable, optimized content in multiple formats to attract and engage your target audience while improving search engine visibility.
Link Building Strategies for eCommerce
Link building is an important element of eCommerce SEO that can significantly increase your online store's visibility and authority. By implementing effective strategies, you can build a strong digital footprint and improve your search engine rankings.
Guest Blogging
Guest blogging remains one of the most effective link building techniques for eCommerce sites. Reach out to relevant industry blogs and offer to contribute high-quality content. This not only helps you earn valuable backlinks but also establishes your brand as an authority in your niche.
Supplier and Manufacturer Partnerships
Leverage your relationships with suppliers and manufacturers. Many of them have "Where to Buy" pages where they list their retail partners. Ask to be included on these pages, as they often provide high-quality, relevant backlinks.
Product Reviews and Influencer Collaborations
Collaborate with influencers and bloggers in your industry to review your products. This can lead to natural, organic backlinks from their websites and social media platforms. Make sure the influencers you work with align with your brand values and target audience.
Create Linkable Assets
Create valuable resources that other websites will want to link to. This could include in-depth buying guides, industry reports, or infographics. These assets not only attract links but also position your brand as a helpful resource for potential customers.
Key Takeaway: Implementing diverse link building strategies, from guest blogging to creating linkable assets, can significantly boost your eCommerce site's authority and search engine rankings.
Measuring eCommerce SEO Success
Measuring the success of your eCommerce SEO efforts is crucial for understanding the impact of your strategies and making data-driven decisions. By tracking key metrics, you can identify areas for improvement and optimize your approach for better results.
Key Performance Indicators (KPIs)
To measure your eCommerce SEO success effectively, focus on these key KPIs (a short sketch after this list shows how a few of them relate to each other):
1. Organic traffic
2. Conversion rate
3. Revenue from organic search
4. Keyword rankings
5. Average order value
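The figures below are hypothetical and the variable names are assumptions; the sketch simply shows how three of these KPIs can be derived from numbers you can export from an analytics tool.

```python
# Hypothetical monthly figures exported from an analytics tool.
organic_sessions = 12_500    # KPI 1: organic traffic
organic_orders = 210
organic_revenue = 18_900.00  # KPI 3: revenue from organic search (USD)

conversion_rate = organic_orders / organic_sessions       # KPI 2
average_order_value = organic_revenue / organic_orders    # KPI 5

print(f"Organic sessions:    {organic_sessions}")
print(f"Conversion rate:     {conversion_rate:.2%}")
print(f"Organic revenue:     ${organic_revenue:,.2f}")
print(f"Average order value: ${average_order_value:,.2f}")
```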
Tools for Tracking SEO Performance
Use powerful analytics tools to collect and analyze data:
- Google Analytics: Track website traffic, user behavior, and conversions
- Google Search Console: Monitor search performance and identify technical issues
- SEMrush or Ahrefs: Track keyword rankings and analyze competitors
Interpreting SEO Data
Don't just collect data; analyze it to gain actionable insights:
- Identify trends in organic traffic and conversions
- Assess which keywords are driving the most valuable traffic
- Analyze the impact of SEO changes on overall performance
Setting Realistic Goals
Set achievable targets based on your current performance and industry benchmarks. Regularly review and adjust your goals as your SEO strategy evolves.
Remember, SEO is a long-term strategy. While some improvements may appear quickly, significant results typically take time to emerge.
Key Takeaway: Measuring eCommerce SEO success involves tracking key metrics, using the right tools, interpreting data effectively, and setting realistic goals to drive continuous improvement.
Advanced eCommerce SEO Techniques
As the eCommerce landscape evolves, staying ahead of the competition requires implementing advanced SEO strategies. Let's explore some innovative techniques that can give your online store a significant edge.
Leveraging AI and Machine Learning
Artificial intelligence (AI) and machine learning (ML) are transforming eCommerce SEO. These technologies can analyze large volumes of data to predict customer behavior, improve product recommendations, and personalize search results. Implementing AI-powered chatbots can also improve user experience and increase conversions.
Voice Search Optimization
With the growing popularity of voice assistants, optimizing for voice search is essential. Focus on long-tail keywords that mirror natural speech patterns and create content that answers the specific questions your customers may ask.
Visual Search Optimization
Visual search is gaining traction in eCommerce. Optimize your product images with descriptive alt text and implement schema markup to improve their visibility in image search results. Consider using visual search tools that allow customers to find products based on images.
Implementing Structured Data
Structured data helps search engines understand your content better. Use schema markup to highlight key information about your products, such as prices, availability, and reviews. This can produce rich snippets in search results, improving click-through rates.
Mobile-First Indexing
With Google's mobile-first indexing, making sure your eCommerce site is fully optimized for mobile devices is more important than ever. Focus on responsive design, fast loading times, and easy navigation on smaller screens.
User Intent Optimization
Go beyond basic keyword optimization and focus on understanding and addressing user intent. Create content and product pages that align with different stages of the buyer's journey, from awareness to purchase.
Key Takeaway: Advanced eCommerce SEO techniques like AI implementation, voice search optimization, and a focus on user intent can significantly increase your online store's visibility and conversions.
Choosing an eCommerce SEO Agency
Choosing the right eCommerce SEO agency can make or break your online store's success. With so many options available, it's important to choose carefully. Here's what to consider:
Expertise and Experience
Look for an agency with a proven track record in eCommerce SEO. They should have experience working with different platforms and understand the unique challenges of online retail. Ask for case studies or success stories specific to your industry.
Customized Strategies
Avoid agencies that offer one-size-fits-all solutions. A top-rated eCommerce SEO agency will tailor its approach to your specific needs, target audience, and business goals. They should conduct a thorough analysis of your website before recommending a strategy.
Transparency and Reporting
A professional SEO agency should provide clear, regular reports on your campaign's progress. They should be able to explain their methods and the metrics they use to measure success. Look for agencies that offer detailed insights into organic traffic, keyword rankings, and ROI.
Comprehensive Services
The best eCommerce SEO agencies offer a full suite of services, including keyword research, on-page optimization, technical SEO, content creation, and link building. This holistic approach ensures all aspects of your online presence are optimized for search engines.
Client Communication
Choose an agency that values open communication. They should be responsive to your questions and concerns and keep you informed about changes in the SEO landscape that could affect your strategy.
Key Takeaway: When choosing an eCommerce SEO agency, prioritize expertise, customized strategies, transparency, comprehensive services, and effective communication to get the best results for your online store.
Maximizing ROI with eCommerce SEO
In the competitive world of online retail, maximizing your return on investment (ROI) is essential for success. eCommerce SEO plays a crucial role in achieving this goal by driving targeted traffic to your online store and improving conversions.
Increased Visibility Leads to Higher Sales
By implementing effective SEO strategies, your products gain better exposure in search results. This increased exposure translates directly into more potential customers finding your offerings, ultimately leading to higher sales and revenue.
Cost-Effective Marketing Strategy
Compared to paid advertising, SEO offers a more sustainable and cost-efficient way to attract customers. While it may require an initial investment, the long-term benefits far outweigh the costs, resulting in a higher ROI over time.
Targeting High-Intent Customers
eCommerce SEO allows you to target customers who are actively searching for products like yours. By optimizing for relevant keywords, you attract users with high purchase intent, increasing the likelihood of conversions and maximizing your ROI.
Improved User Experience
SEO best practices often align with providing a better user experience. A well-optimized website with fast load times, easy navigation, and relevant content not only ranks better but also encourages visitors to stay longer and make purchases.
Key Takeaway: eCommerce SEO maximizes ROI by increasing visibility, targeting high-intent customers, and improving user experience, all while offering a cost-effective marketing strategy for long-term success.
Conclusion
Unlocking the power of eCommerce SEO is your key to maximizing ROI and outpacing the competition. By implementing the strategies discussed, from platform-specific optimization to advanced techniques, you'll be well equipped to improve your online store's visibility and sales. Remember, success in eCommerce SEO is an ongoing process that requires commitment and adaptability.
Don't let your online store get lost in the digital noise. Act today by conducting thorough keyword research, optimizing your product pages, and implementing a robust content marketing strategy. Consider partnering with a top-rated eCommerce SEO agency to leverage expert insights and stay ahead of the curve. With the right approach, you can transform your eCommerce business into a thriving online powerhouse, driving organic traffic and conversions like never before. The future of your online success starts now. Are you ready to seize it?
FAQs
How long does it typically take to see results from eCommerce SEO efforts?
Answer: Results from eCommerce SEO efforts vary, but you may start seeing improvements in 3-6 months. However, substantial changes in rankings and organic traffic usually take 6-12 months, depending on factors like competition, website age, and the effectiveness of your SEO strategy.
What are some common eCommerce SEO mistakes to avoid?
Answer: Common eCommerce SEO mistakes include neglecting mobile optimization, using duplicate content across product pages, ignoring site speed, overlooking internal linking, and failing to optimize product images. It's also important to avoid keyword stuffing and to focus on creating high-quality, unique content for each product page.
How can I optimize my eCommerce website for voice search?
Answer: To optimize for voice search, focus on long-tail keywords and natural language phrases. Create FAQ sections addressing common customer questions, use structured data markup, and make sure your site is mobile-friendly. Also, optimize for local search if applicable, as many voice searches have local intent.
What role does user experience (UX) play in eCommerce SEO?
Answer: User experience is crucial for eCommerce SEO. A well-designed, easy-to-navigate website with fast load times can reduce bounce rates and increase time on site, signaling quality to search engines. Good UX also encourages positive user behavior metrics, which can indirectly boost SEO performance and conversions.
How can I leverage social media for my eCommerce SEO strategy?
Answer: While social media doesn't directly affect SEO rankings, it can indirectly boost your eCommerce SEO efforts. Use social platforms to increase brand visibility, drive traffic to your site, and encourage social sharing of your content. This can lead to more backlinks, increased engagement, and improved online visibility, all of which support your SEO goals.
Implementing this requires adjustments to the HTML of your website pages, where you embed the specific tags needed for these schemas directly into your site's code or through tools that automate some steps, like Google's Structured Data Markup Helper. This audit includes checking your local search rankings, reviewing your reputation management strategies, and analyzing your inbound local links, among other factors.
Strategic Advantages
Incorporating long-tail keywords into your SEO strategy can significantly enhance your visibility and attract highly targeted traffic to your site. Proper localization will help establish your site's prominence within local searches pertinent to each operational area. By utilizing advanced tools for local keyword research, businesses can pinpoint precisely how to optimize their site's localization and which keywords to target. Moreover, ensuring that contact pages and footers have localized information can significantly boost local search engine results page (SERP) placements.
Reputation Management & Google Reviews
Maintaining a positive online reputation through proactive review management can dramatically affect customer perceptions and decision-making processes.
OPTIMIZING GOOGLE MY BUSINESS AND MANAGING ONLINE REPUTATION
Setting up an optimized Google My Business profile is essential due to its direct influence on local search visibility, including map packs and Local Finder results.
Manage Online Reputation and Reviews
Proactively manage customer reviews on platforms like Google to enhance reputation management effectively.
A key strategy involves creating location-specific content that resonates with the local audience. For businesses operating in multiple locations, it is essential not only to localize the main site but also to create specific landing pages optimized for each location.
Analyzing the Effectiveness of Your Current SEO Strategy as a Wholesaler
Local SEO Audit Insights
When evaluating your current SEO strategy as a wholesaler, the first step is conducting a thorough Local SEO audit. This means incorporating region-specific keywords naturally within your site's content, from product descriptions to metadata like title tags and headers. If correctly implemented, localization strengthens the visibility of your website in local searches without needing to overly generalize content for all locations. Effective optimization involves regular updates, like posts about promotions or company news, which keep potential customers informed and engaged directly through GMB.
Creating Compelling Local Content
Content that resonates with a local audience can significantly boost your SEO efforts. For instance, if operating a distribution business within a particular region, providing detailed guides related to logistics or regional supply chain issues could position you as a go-to resource locally. Scheduled updates through blog posts about local events or guides on regional amenities can anchor your site's authority in a locality.
Additionally, regular updates through blog posts about community events or news can keep the content fresh and more engaging for locals. Each piece of content should be crafted to support not just SEO goals but also provide real value to local residents, thereby enhancing engagement and improving search rankings.
Local Content Creation
Developing relevant, locally focused content is essential for connecting with regional audiences.
WEBSITE LOCALISATION
Effective localization of your website plays a pivotal role in ensuring relevance in local search results. Links from reputable and relevant sites carry more weight and contribute positively to your SEO efforts. Highlighting testimonials on key pages or incorporating user-generated content can enrich site content relevancy, another factor appreciated by search engines when ranking sites for local queries. This comprehensive health check serves not only as due diligence but also identifies areas for improvement and opportunities for growth in local search visibility. For wholesalers looking to increase visibility in specific geographic areas, building a robust profile of accurate and consistent local citations is crucial.
Localizing Your Website Content
For a wholesale distributor, ensuring that your website is localized is fundamental. By using schema markup (a form of structured data), businesses can help search engines better understand specific details about their company, such as products offered, services provided, business hours, and geographical locations. By understanding your current standing and the competitive landscape, you can pinpoint areas for improvement and opportunities to outshine competitors. Accurate and comprehensive information on your GMB profile not only boosts your visibility but also increases the likelihood of consumer engagement and conversion.
Website Localization Techniques
A crucial aspect of your SEO strategy involves website localization. Accurate and complete profiles, supplemented with regular posts about offers or operations, can transform how local prospects perceive your brand.
Addressing both positive and negative reviews while also exploring additional review platforms broadens reputational reach.
Optimizing Your Google Business Profile
A well-optimized Google Business Profile (GMB) is essential for appearing prominently in local searches and map listings. If you operate in multiple locations, it's not just about optimizing the main site; creating individual landing pages tailored to each location can significantly boost local visibility. As a round-up, leveraging these strategic elements within your wholesale distribution company's SEO plan can dramatically improve how you connect with targeted local markets while staying ahead of competitive dynamics. Additionally, reviewing on-page performance on your website will help identify how well your content aligns with local SEO best practices. Responding to both positive and negative reviews also shows prospective customers that the business values consumer input and is committed to maintaining high standards of customer satisfaction.
Why User Experience Matters in SEO for Wholesale Distributors
Understanding User Experience in SEO
User experience (UX) fundamentally shapes the way wholesale distributors are perceived online. For wholesale distributors looking to capture and retain customer attention at the local level, optimizing images, streamlining code, and leveraging browser caching can provide a competitive edge.
Implementing Structured Data to Improve Local SEO Rankings
Understanding Structured Data and Its Importance for Local SEO
Structured data refers to a standardized format for providing information about a page and classifying the page content. Understanding what potential customers are searching for in relation to your products or services is vital. The goal is to clearly signal to search engines the geographic relevance of the business, which enhances its prominence in local search results.
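As a hedged illustration of what that structured data might look like for a distributor, the sketch below builds a schema.org LocalBusiness-style object (here using the WholesaleStore subtype) as JSON-LD in Python; every business detail shown is a placeholder to replace with your real name, address, and phone data.

```python
import json

# Placeholder business details; replace with your real NAP (name, address, phone) data.
local_business = {
    "@context": "https://schema.org",
    "@type": "WholesaleStore",
    "name": "Example Wholesale Distribution",
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Industrial Way",
        "addressLocality": "Springfield",
        "addressRegion": "IL",
        "postalCode": "62701",
        "addressCountry": "US",
    },
    "areaServed": "Springfield metropolitan area",
    "openingHours": "Mo-Fr 08:00-17:00",
}

# Paste the printed tag into the page's HTML.
print('<script type="application/ld+json">')
print(json.dumps(local_business, indent=2))
print("</script>")
```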
Regular updates, accurate information across all locations (such as hours of operation), and gathering positive reviews improve trustworthiness and encourage higher click-through rates from potential customers. These are important because they help improve your website's authority and search engine rankings, making it easier for potential customers to find you. Such content not only attracts potential customers but also improves search rankings by establishing a strong thematic link between your services and the locality. Insights into competitors are also provided to help you understand the competitive landscape better.
Leveraging Content Development for Local Relevance
Creating content that resonates with a local audience forms a crucial part of asserting dominance over local searches.
Essentially, these are mentions of your business that appear on various online platforms, which include your company's name, address, and phone number (NAP). Additionally, sharing relevant photos and videos through these posts can greatly enhance your profile's appeal, providing a vivid showcase of your products or services. This process involves analyzing keyword trends within specific areas to optimize website content accordingly and align with user intent.
LOCAL CONTENT STRATEGY
Developing locally focused content is key to engaging community members and boosting regional search rankings. With advances in search technology, users no longer need to specify their location in search queries; search engines now intuitively generate results based on the user's current location.
The Role of Website Localisation
For a wholesaler operating in multiple regions or countries, website localization is essential. These users often use conversational language and seek immediate, specific results, which means your business needs to anticipate and answer those queries directly and succinctly.
REPUTATION MANAGEMENT AND GOOGLE REVIEWS
Managing online reviews is essential for maintaining a positive reputation locally; it influences consumer trust significantly more than nationwide brands might experience.
Importance of Local Keyword Research
Understanding what potential customers are searching for locally is crucial to any Local SEO strategy.
Crafting Locally Relevant Content
Developing content that resonates with a local audience not only enhances user engagement but also strengthens your SEO efforts.
Leveraging Local Citations to Boost Your Wholesale SEO Efforts
Understanding the Impact of Local Citations
Local citations play a pivotal role in enhancing search engine optimization (SEO) for wholesale distributors. This detailed evaluation not only highlights potential issues but also uncovers opportunities to outshine competitors in local searches.
For instance, if you manage a wholesale distribution center, we might develop specific guides or listings that cater to businesses relocating within the area, adding significant value and relevance for those considering or making regional moves. Regular updates, accurate NAP (Name, Address, Phone Number) details, customer review management, and posting relevant updates about your business are all practices that enhance both traditional and voice search SEO.
By uploading videos directly onto GMB, businesses can provide potential customers with a richer understanding of their operations, products, and services without them needing to visit the actual site initially. This process involves an analysis of keyword trends, buyer intent, and competition levels to optimize your website's content and metadata accordingly.
Local Content Creation
Creating engaging local content is another pillar of effective on-page SEO. In effect this means long-tail keyword optimization is not just about improving search rankings; it's about creating a better connection with your target audience by addressing their specific concerns and needs at exactly the right time. In effect this means utilizing comprehensive Local SEO strategies ensures maximum visibility for wholesale distributors looking to dominate their regional markets. Regular audits of your SEO strategy, including monitoring keyword rankings and updating listings, ensure sustained improvement over time without losing ground to competitors. This includes the strategic placement of regional keywords within the site's content and metadata as well as creating individual location pages for businesses with multiple locations.
MANAGING REPUTATION AND BUILDING CITATIONS
Lastly, online reputation management cannot be overlooked, as it directly influences consumer decisions. In effect this means that integrating social media into your Local SEO strategy isn't just beneficial; it's necessary for staying competitive in today's market environment, especially for wholesale distributors looking to capture focused regional attention online.
Scheduled posts about community events or important landmarks provide value to residents and establish your site as a local authority source. Regular posts about promotions or news keep your audience engaged, while managing online reviews across various platforms helps maintain a positive reputation, which greatly influences consumer decisions at a local level. In effect this means that achieving higher rankings in local search results requires a multifaceted approach involving technical optimizations, strategic content creation, proactive reputation management, and ongoing monitoring efforts, all tailored specifically towards enhancing visibility among nearby audiences. Regular updates through GMB posts allow businesses to share timely information, promote special offers directly within search results, and maintain an active online presence that appeals to modern consumers who prioritize up-to-date information about business operations.
LOCAL KEYWORD RESEARCH
Keyword research tailored to local contexts is vital for capturing the specific search behaviors and preferences of regional audiences.
Content Creation with a Local Focus
Creating content that speaks directly to a local audience can significantly bolster a distributor's online presence. This involves incorporating your city, region, or country naturally within the site's content, such as in metadata and headers. For businesses operating in multiple locations, creating individual pages tailored to each location can significantly enhance local visibility.
This strategy not only clarifies relevance for search engines but also enhances visibility in local search results. They tend to be less competitive than more generic keywords because they target a more focused group, which is particularly beneficial for wholesalers who specialize in specific product categories or services.
Product-Specific Keyword Optimization
It's about embedding the name of your area, city, or region naturally in your site's content, including key metadata elements like meta tags and headers.
Optimizing Your Google Business Profile
A well-optimized Google Business Profile (GMB) significantly increases your chances of discovery through Google Maps and local search queries. It also includes an analysis of inbound links and a thorough review of your Google Business Profile(s). These practices help signal to search engines the relevance of your website in specific local searches, enhancing visibility where it matters most.
Maximizing Local Keywords for Your Distribution Business
Local SEO Audit
A comprehensive local SEO audit serves as the foundation for enhancing your distribution business's online visibility.
Set Up and Optimize Your Google Business Profile (GMB)
A well-optimized GMB profile is imperative for being visible in localized searches on Google Maps and Local Pack results.
By utilizing advanced keyword research tools, you can discover the precise terms used by potential customers in your area. This includes not only the visible text but also behind-the-scenes elements like metadata and schema markup.
Harnessing Local Keyword Research
Effective keyword research is pivotal in aligning what potential customers are searching for with the products or services you offer. By incorporating videos into your SEO strategy, you can engage more deeply with your local audience. In effect this means the strategic acquisition and management of customer reviews must be a cornerstone of any successful local SEO strategy for distributors. In effect this means focusing on these strategic areas ensures that wholesale distributors not only improve their online visibility but also connect more effectively with their target market locally. These schemas allow the inclusion of crucial business details that affect local SEO performance, like address, phone number, area served, and operating hours. For wholesalers, this means not just any traffic, but traffic from potential buyers who are looking for exactly what you offer.
Search engine optimization (SEO) is the process of improving the quality and quantity of website traffic to a website or a web page from search engines.[1][2] SEO targets unpaid traffic (known as "natural" or "organic" results) rather than direct traffic or paid traffic. Unpaid traffic may originate from different kinds of searches, including image search, video search, academic search,[3] news search, and industry-specific vertical search engines.
As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. SEO is performed because a website will receive more visitors from a search engine when websites rank higher on the search engine results page (SERP). These visitors can then potentially be converted into customers.[4]
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines, which would send a web crawler to crawl that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider/crawler crawling a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
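As a rough, toy illustration of that crawl-and-index loop, the sketch below fetches a single page, counts its words as a stand-in for indexing, and extracts the links a scheduler would queue for later crawling. It relies on the third-party requests and beautifulsoup4 packages, uses a placeholder URL, and is in no way representative of how a production search engine works.

```python
from collections import Counter

import requests
from bs4 import BeautifulSoup

# Placeholder starting point for the toy crawl.
SEED_URL = "https://example.com/"

def crawl_and_index(url):
    """Fetch one page, 'index' its words, and collect the outgoing links."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Indexer step: count the words visible on the page.
    words = soup.get_text(separator=" ").lower().split()
    index = Counter(words)

    # Crawler step: extract links that would go back into the crawl schedule.
    links = [a["href"] for a in soup.find_all("a", href=True)]
    return index, links

if __name__ == "__main__":
    index, links = crawl_and_index(SEED_URL)
    print("Most common words:", index.most_common(5))
    print("Links found:", links)
```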
Website owners recognized the value of a high ranking and visibility in search engine results,[6] creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term.[7]
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Flawed data in meta tags, such as those that were inaccurate or incomplete, created the potential for pages to be mischaracterized in irrelevant searches.[8][dubious – discuss] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[9] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Altavista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[10]
By heavily relying on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[11] Since the success and popularity of a search engine are determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[12] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[13] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[14]
Some search engines have also reached out to the SEO industry and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[15][16] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[17] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and track the web pages index status.
In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products. In response, many brands began to take a different approach to their Internet marketing strategies.[18]
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[19] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
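To make the random-surfer idea concrete, here is a minimal sketch of the power-iteration form of PageRank over a tiny, made-up link graph; the 0.85 damping factor follows the commonly cited value, and the graph and page names are purely illustrative.

```python
# Tiny, made-up link graph: page -> pages it links to.
links = {
    "home": ["products", "blog"],
    "products": ["home"],
    "blog": ["home", "products"],
}

def pagerank(graph, damping=0.85, iterations=50):
    """Basic power iteration; assumes every page has at least one outgoing link."""
    pages = list(graph)
    rank = {page: 1 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {}
        for page in pages:
            # Rank flowing in from every page that links here, split across its outlinks.
            incoming = sum(
                rank[src] / len(targets)
                for src, targets in graph.items()
                if page in targets
            )
            new_rank[page] = (1 - damping) / len(pages) + damping * incoming
        rank = new_rank
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```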
Page and Brin founded Google in 1998.[20] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[21] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focus on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[22]
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation.[23] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions.[24] Patents related to search engines can provide information to better understand search engines.[25] In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged in users.[26]
In 2007, Google announced a campaign against paid links that transfer PageRank.[27] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Google Bot would no longer treat nofollow links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[28] As a result of this change, the use of nofollow led to evaporation of PageRank. In order to avoid the above, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the usage of iframes, Flash, and JavaScript.[29]
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[30] On June 8, 2010 a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing than before, Google Caffeine was a change to the way Google updated its index in order to make things show up quicker on Google than before. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[31] Google Instant, real-time-search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[32]
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique.[33] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[34] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[35] by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of "conversational search", where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than a few words.[36] For content publishers and writers, Hummingbird is intended to resolve issues by filtering out irrelevant content and spam, allowing Google to surface high-quality content and treat its creators as 'trusted' authors.
In October 2019, Google announced it would start applying BERT models for English-language search queries in the US. Bidirectional Encoder Representations from Transformers (BERT) was another attempt by Google to improve its natural language processing, this time in order to better understand the search queries of its users.[37] In terms of search engine optimization, BERT was intended to connect users more easily to relevant content and to increase the quality of traffic coming to websites ranking in the search engine results page.
The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[38] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links,[39] in addition to its URL submission console.[40] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[41] however, this practice was discontinued in 2009.
Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by search engines. The distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.[42]
Mobile devices are used for the majority of Google searches.[43] In November 2016, Google announced a major change to the way it crawls websites and started to make its index mobile-first, which means the mobile version of a given website becomes the starting point for what Google includes in its index.[44] In May 2019, Google updated the rendering engine of its crawler to the latest version of Chromium (74 at the time of the announcement) and indicated that it would regularly update the Chromium rendering engine to the latest version.[45] In December 2019, Google began updating the User-Agent string of its crawler to reflect the latest Chrome version used by its rendering service. The update was delayed to give webmasters time to adjust any code that responded to particular bot User-Agent strings; Google ran evaluations and felt confident the impact would be minor.[46]
To keep undesirable content out of the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and instructs the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish to be crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47] In 2020, Google sunsetted the standard (and open-sourced its code) and now treats it as a hint rather than a directive; to adequately ensure that pages are not indexed, a page-level robots meta tag should be included.[48]
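As a rough illustration of these controls, the sketch below uses Python's standard urllib.robotparser against a hypothetical robots.txt (the domain, paths, and crawler name are placeholders), and shows the page-level noindex tag mentioned above.

```python
from urllib import robotparser

# A hypothetical robots.txt, as it might be served at https://example.com/robots.txt
ROBOTS_TXT = """\
User-agent: *
Disallow: /cart/
Disallow: /search
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())  # in practice: rp.set_url("https://example.com/robots.txt"); rp.read()

for path in ("/products/widget", "/cart/checkout", "/search?q=widgets"):
    allowed = rp.can_fetch("MyCrawler", f"https://example.com{path}")
    print(path, "->", "crawl" if allowed else "skip")

# robots.txt only discourages crawling; to keep a page out of the index itself,
# the page should serve a robots meta tag in its <head>:
NOINDEX_TAG = '<meta name="robots" content="noindex">'
```

Note the distinction the paragraph above draws: blocking a page in robots.txt does not guarantee it stays out of the index, whereas the page-level noindex tag addresses indexing directly.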
A variety of methods can increase the prominence of a webpage within the search results. Cross-linking between pages of the same website to provide more links to important pages may improve its visibility. Page design makes users trust a site and want to stay once they find it; when people bounce off a site, it counts against the site and affects its credibility.[49] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic, and updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL canonicalization of web pages accessible via multiple URLs, using the canonical link element[50] or 301 redirects, can help make sure that links to the different versions of the URL all count towards the page's link popularity score. These incoming links, which point to the URL, count towards the page's link popularity score and affect the credibility of the website.[49]
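For illustration only, the following sketch assembles the on-page elements just described: a title tag, a meta description, and a canonical link element. The helper function, domain, and copy are hypothetical, not a prescribed template.

```python
def render_head(title: str, description: str, canonical_url: str) -> str:
    """Build the <head> metadata discussed above for a single page."""
    return (
        "<head>\n"
        f"  <title>{title}</title>\n"
        f'  <meta name="description" content="{description}">\n'
        f'  <link rel="canonical" href="{canonical_url}">\n'
        "</head>"
    )

# Duplicate URLs (e.g. with tracking parameters) should all point their
# canonical link at one preferred URL so link signals consolidate there.
print(render_head(
    title="Wholesale Widgets in Springfield | Example Co.",
    description="Bulk widget distribution with same-day local delivery.",
    canonical_url="https://example.com/widgets/",
))
```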
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). Search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods and the practitioners who employ them as either white hat SEO or black hat SEO.[51] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[52]
An SEO technique is considered a white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[15][16][53] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[54] although the two are not identical.
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines or involve deception. One black hat technique uses hidden text, either as text colored similar to the background, in an invisible div, or positioned off-screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. Another category sometimes used is grey hat SEO. This sits between the black hat and white hat approaches: the methods employed avoid the site being penalized but do not go as far as producing the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.
Search engines may penalize sites they discover using black or grey hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for the use of deceptive practices.[55] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's search engine results page.[56]
SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns, depending on the site operator's goals. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns. Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. SEM focuses on prominence more so than relevance; website developers should regard SEM with the utmost importance with consideration to visibility, as most users navigate to the primary listings of their search.[57] A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade internet users, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[58][59] In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[60] which revealed a shift in its focus towards "usefulness" and mobile local search. In recent years the mobile market has exploded, overtaking the use of desktops, as shown by StatCounter in October 2016, when it analyzed 2.5 million websites and found that 51.3% of the pages were loaded by a mobile device.[61] Google has been one of the companies utilizing the popularity of mobile usage by encouraging websites to use its Google Search Console and the Mobile-Friendly Test, which lets companies check how their website performs against the search engine's mobile criteria and how user-friendly it is. The closer together related key terms appear on a page, the more their ranking for those terms is said to improve.[49]
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantee and uncertainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[62] Search engines can change their algorithms, impacting a website's search engine ranking, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day.[63] It is considered a wise business practice for website operators to liberate themselves from dependence on search engine traffic.[64] In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[65] In markets outside the United States, Google's share is often larger, and Google remained the dominant search engine worldwide as of 2007.[66] As of 2006, Google had an 85–90% market share in Germany.[67] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[67] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[68] Google achieves a similarly dominant share in a number of other countries.
As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia, and the Czech Republic, where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.
Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.[67]
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[69][70]
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[71][72]
Search analytics is the use of search data to investigate particular interactions among Web searchers, the search engine, or the content during searching episodes.[1] The resulting analysis and aggregation of search engine statistics can be used in search engine marketing (SEM) and search engine optimization (SEO). In other words, search analytics helps website owners understand and improve their performance on search engines, for example by identifying highly valuable site visitors[2] or understanding user intent.[3] Search analytics includes search volume trends and analysis, reverse searching (entering websites to see their keywords), keyword monitoring, search result and advertisement history, advertisement spending statistics, website comparisons, affiliate marketing statistics, multivariate ad testing, and more.[4]
Search analytics data can be collected in several ways. Search engines provide access to their own data with services such as Google Analytics,[5] Google Trends, and Google Insights. Third-party services must collect their data from ISPs, phone-home software, or by scraping search engines. Getting traffic statistics from ISPs and phone-home software provides broader reporting of web traffic in addition to search analytics. Services that perform keyword monitoring only scrape a limited set of search results, depending on their clients' needs. Services providing reverse search, however, must scrape a large set of keywords from the search engines, usually in the millions, to find the keywords that everyone is using.[6]
Since search results, especially advertisements, differ depending on where you are searching from, data collection methods have to account for geographic location. Keyword monitors do this more easily since they typically know what location their client is targeting. However, to get an exhaustive reverse search, several locations need to be scraped for the same keyword.
Search analytics accuracy depends on the service being used, the data collection method, and data freshness. Google releases its own data, but only in an aggregated way and often without assigning absolute values, such as the number of visitors, to its graphs.[7] ISP logs and phone-home methods are accurate for the population they sample, so sample size and demographics must be adequate to accurately represent the larger population. Scraping results can be highly accurate, especially when looking at the non-paid, organic search results. Paid results, from Google AdWords for example,[8] are often different for the same search depending on the time, geographic location, and search history of a particular computer, which means that scraping advertisement data can be hit or miss.
Taking a look at Google Insights to gauge the popularity of these services shows that compared to searches for the term AdWords (Google's popular search ad system), use of search analytics services is still very low, around 1-25% as of Oct. 2009.[9] This could point to a large opportunity for the users and makers of search analytics given that services have existed since 2004 with several new services being started since.
Social media optimization (SMO) is the use of online platforms to generate income or publicity to increase the awareness of a brand, event, product, or service. Types of social media involved include RSS feeds, blogging sites, social bookmarking sites, social news websites, video sharing websites such as YouTube, and social networking sites such as Facebook, Instagram, TikTok, and X (Twitter). SMO is similar to search engine optimization (SEO) in that the goal is to drive web traffic and draw attention to a company or creator, but SMO's focal point is on gaining organic links to social media content, whereas SEO's core is about reaching the top of the search engine hierarchy.[1] In general, social media optimization refers to optimizing a website and its content to encourage more users to use and share links to the website across social media and networking sites.[2]
SMO is used to strategically create online content ranging from well-written text to eye-catching digital photos or video clips that encourages and entices people to engage with a website. Users share this content, via its weblink, with social media contacts and friends. Common examples of social media engagement are "liking and commenting on posts, retweeting, embedding, sharing, and promoting content".[3] Social media optimization is also an effective way of implementing online reputation management (ORM), meaning that if someone posts bad reviews of a business, an SMO strategy can ensure that the negative feedback is not the first link to come up in a list of search engine results.[4]
In the 2010s, with social media sites overtaking TV as a source for news for young people, news organizations have become increasingly reliant on social media platforms for generating web traffic. Publishers such as The Economist employ large social media teams to optimize their online posts and maximize traffic,[5] while other major publishers now use advanced artificial intelligence (AI) technology to generate higher volumes of web traffic.[6]
Social media optimization is an increasingly important factor in search engine optimization, which is the process of designing a website so that it ranks as highly as possible on search engines. Search engines increasingly utilize the recommendations of users of social networks such as Reddit, Facebook, Tumblr, Twitter, YouTube, LinkedIn, Pinterest, and Instagram to rank pages in the search engine results pages.[7] The implication is that when a webpage is shared or "liked" by a user on a social network, it counts as a "vote" for that webpage's quality, and search engines can use such votes to rank websites more accurately in search engine results pages. Furthermore, since it is more difficult to tip the scales or influence the search engines in this way, search engines are putting more stock into social search.[7] This, coupled with increasingly personalized search based on interests and location, has significantly increased the importance of a social media presence in search engine optimization. Because of personalized search results, location-based social media presences on websites such as Yelp, Google Places, Foursquare, and Yahoo! Local have become increasingly important. While social media optimization is related to search engine marketing, it differs in several ways. Primarily, SMO focuses on driving web traffic from sources other than search engines, though improved search engine ranking is also a benefit of successful social media optimization. SMO also helps target particular geographic regions in order to reach potential customers, which aids lead generation (finding new customers) and contributes to higher conversion rates (i.e., converting previously uninterested individuals into people who are interested in a brand or organization).
Social media optimization is in many ways connected to the technique of viral marketing or "viral seeding", where word of mouth is created through the use of networking in social bookmarking, video, and photo sharing websites. An effective SMO campaign can harness the power of viral marketing; for example, 80% of activity on Pinterest is generated through "repinning".[citation needed] Furthermore, by following social trends and utilizing alternative social networks, websites can retain existing followers while also attracting new ones. This allows businesses to build an online following and presence, all linking back to the company's website for increased traffic. For example, with an effective social bookmarking campaign, not only can website traffic be increased, but a site's rankings can also be improved. In a similar way, engagement with blogs creates a comparable result by sharing content through the use of RSS in the blogosphere. Social media optimization is considered an integral part of an online reputation management (ORM) or search engine reputation management (SERM) strategy for organizations or individuals who care about their online presence.[8] SMO is one of six key influencers that affect the Social Commerce Construct (SCC); online activities such as consumers' evaluations of and advice on products and services constitute part of what creates the SCC.[citation needed]
Social media optimization is not limited to marketing and brand building. Increasingly, smart businesses are integrating social media participation as part of their knowledge management strategy (i.e., product/service development, recruiting, employee engagement and turnover, brand building, customer satisfaction and relations, business development and more). Additionally, social media optimization can be implemented to foster a community of the associated site, allowing for a healthy business-to-consumer (B2C) relationship.[9]
According to technologist Danny Sullivan, the term "social media optimization" was first used and described by marketer Rohit Bhargava[10][11] on his marketing blog in August 2006. In the same post, Bhargava established the five important rules of social media optimization. Bhargava believed that by following his rules, anyone could influence the levels of traffic and engagement on their site, increase popularity, and ensure that it ranks highly in search engine results. An additional 11 SMO rules have since been added to the list by other marketing contributors.
The 16 rules of SMO, according to one source, are as follows:[12]
Bhargava's initial five rules were tailored specifically to SMO, while the list is now much broader and addresses everything that can be done across different social media platforms. According to Lee Odden, author and CEO of TopRank Online Marketing, a social media strategy is also necessary to ensure optimization; this is a similar concept to Bhargava's list of rules for SMO.
The Social Media Strategy may consider:[13]
According to Lon Safko and David K. Brake in The Social Media Bible, it is also important to act like a publisher by maintaining an effective organizational strategy, to have an original concept and a unique "edge" that differentiates one's approach from competitors, and to experiment with new ideas if things do not work the first time.[4] If a business is blog-based, an effective method of SMO is using widgets that allow users to share content to their personal social media platforms. This ultimately reaches a wider target audience and drives more traffic to the original post. Blog widgets and plug-ins for post-sharing are most commonly linked to Facebook, LinkedIn, and x.com; they occasionally also link to social media platforms such as Tumblr and Pinterest. Many sharing widgets also include user counters which indicate how many times the content has been liked and shared across different social media pages. This can influence whether or not new users will engage with the post, and also gives businesses an idea of which kinds of posts are most successful at engaging audiences. By using relevant and trending keywords in titles and throughout blog posts, a business can also improve search engine optimization and the chances of its content being read and shared by a large audience.[13] The root of effective SMO is the content that is being posted, so professional content creation tools can be very beneficial. These can include editing programs such as Photoshop, GIMP, Final Cut Pro, and Dreamweaver. Many websites also offer customization options such as different layouts to personalize a page and create a point of difference.[4]
With social media sites overtaking TV as a source for news for young people, news organizations have become increasingly reliant on social media platforms for generating traffic. A report by the Reuters Institute for the Study of Journalism described how a 'second wave of disruption' had hit news organizations,[14] with publishers such as The Economist having to employ large social media teams to optimize their posts and maximize traffic.[5] Even professional fields within the publishing industry are utilizing SMO. Because doctors want to maximize exposure to their research findings, SMO has also found a place in the medical field.[15]
Today, 3.8 billion people globally are using some form of social media.[citation needed] People frequently obtain health-related information from online social media platforms like Twitter and Facebook. Healthcare professionals and scientists can communicate with other medical counterparts to discuss research and findings through social media platforms. These platforms provide researchers with data sets and surveillance that help detect patterns and behavior in preventing, informing about, and studying global disease, such as COVID-19. Additionally, researchers utilize SMO to reach and recruit hard-to-reach patients, and SMO narrows specified demographics to filter the data needed for a given study.[citation needed]
Social media gaming is online gaming activity performed through social media sites with friends and online gaming activity that promotes social media interaction. Examples of the former include FarmVille, Clash of Clans, Clash Royale, FrontierVille, and Mafia Wars. In these games a player's social network is exploited to recruit additional players and allies. An example of the latter is Empire Avenue, a virtual stock exchange where players buy and sell shares of each other's social network worth. Nielsen Media Research estimates that, as of June 2010, social networking and playing online games account for about one-third of all online activity by Americans.[16]
Facebook has in recent years become a popular channel for advertising, alongside traditional forms such as television, radio, and print. With over 1 billion active users, 50% of whom log into their accounts every day,[17] it is an important communication platform that businesses can utilize and optimize to promote their brand and drive traffic to their websites. Three strategies are commonly used to increase advertising reach on Facebook: improving the effectiveness of posts, increasing the size of the network (followers), and buying more reach through paid advertising.
Improving effectiveness and increasing network size are organic approaches, while buying more reach is a paid approach that requires no further organic effort.[18] Most businesses will attempt an "organic" approach to gaining a significant following before considering a paid approach. Because Facebook requires a login, it is important that posts are public to ensure they reach the widest possible audience. Posts that have been heavily shared and interacted with by users are displayed as 'highlighted posts' at the top of newsfeeds. To achieve this status, posts need to be engaging, interesting, or useful. This can be achieved by being spontaneous, asking questions, addressing current events and issues, and optimizing trending hashtags and keywords. The more engagement a post receives, the further it spreads and the more likely it is to feature first in search results.
Another organic approach to Facebook optimization is cross-linking different social platforms. By posting links to websites or social media sites in the profile 'about' section, it is possible to direct traffic and ultimately increase search engine optimization. Another option is to share links to relevant videos and blog posts.[13] Facebook Connect is a functionality that launched in 2008 to allow Facebook users to sign up to different websites, enter competitions, and access exclusive promotions by logging in with their existing Facebook account details. This is beneficial to users as they don't have to create a new login every time they want to sign up to a website, but also beneficial to businesses as Facebook users become more likely to share their content. Often the two are interlinked, where in order to access parts of a website, a user has to like or share certain things on their personal profile or invite a number of friends to like a page. This can lead to greater traffic flow to a website as it reaches a wider audience. Businesses have more opportunities to reach their target markets if they choose a paid approach to SMO. When Facebook users create an account, they are urged to fill out their personal details such as gender, age, location, education, current and previous employers, religious and political views, interests, and personal preferences such as movie and music tastes. Facebook then takes this information and allows advertisers to use it to determine how to best market themselves to users that they know will be interested in their product. This can also be known as micro-targeting. If a user clicks on a link to like a page, it will show up on their profile and newsfeed. This then feeds back into organic social media optimization, as friends of the user will see this and be encouraged to click on the page themselves. Although advertisers are buying mass reach, they are attracting a customer base with a genuine interest in their product. Once a customer base has been established through a paid approach, businesses will often run promotions and competitions to attract more organic followers.[12]
The number of businesses that use Facebook to advertise also holds significant relevance. In 2017, there were three million businesses that advertised on Facebook,[19] making it the world's largest platform for social media advertising. The amount of money leading businesses spend on Facebook advertising alone is also notable: Procter & Gamble spends $60 million every year on Facebook advertising.[20] Other advertisers on Facebook include Microsoft, with a yearly spend of £35 million, and Amazon, Nestle, and American Express, all with yearly expenditures above £25 million.
Furthermore, the number of small businesses advertising on Facebook is also relevant. This number has grown rapidly in recent years and demonstrates how important social media advertising has become. Currently, 70% of the UK's small businesses use Facebook advertising,[21] a substantial number of advertisers, and almost half of the world's small businesses use a social media marketing product of some sort. This demonstrates the impact that social media has had on the current digital marketing era.
ER (engagement rate) represents the level of user activity specific to a certain profile on Facebook, Instagram, TikTok, or any other social media platform. A common way to calculate it is the following:
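ER = (average number of interactions per post over the chosen period ÷ total number of followers) × 100%

(One widely used form; exact definitions vary between platforms and measurement tools.)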
In the above formula, followers is the total number of followers (friends, subscribers, etc.), and interactions stands for the number of interactions, such as likes, comments, personal messages, and shares. The latter is averaged over a certain period of time, which should normally be short enough that the variation in the number of followers is negligible during the period.
Search engine optimization (SEO) is the process of improving the quality and quantity of website traffic to a website or a web page from search engines.[1][2] SEO targets unpaid traffic (known as "natural" or "organic" results) rather than direct traffic or paid traffic. Unpaid traffic may originate from different kinds of searches, including image search, video search, academic search,[3] news search, and industry-specific vertical search engines.
As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. SEO is performed because a website will receive more visitors from a search engine when websites rank higher on the search engine results page (SERP). These visitors can then potentially be converted into customers.[4]
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, webmasters only needed to submit the address of a page, or URL, to the various engines, which would send a web crawler to crawl that page, extract links to other pages from it, and return information found on the page to be indexed.[5] In this process, a search engine spider/crawler downloads a page and stores it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
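To make the crawl-and-index pipeline concrete, here is a deliberately simplified Python sketch using only the standard library. The sample page and URLs are invented, and a real crawler would add scheduling, politeness delays, robots.txt checks, deduplication, and far more sophisticated term weighting.

```python
from collections import defaultdict
from html.parser import HTMLParser
from urllib.parse import urljoin

class PageParser(HTMLParser):
    """Collects outgoing links and visible words from one HTML page."""
    def __init__(self, base_url: str):
        super().__init__()
        self.base_url = base_url
        self.links: list[str] = []
        self.words: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(self.base_url, href))

    def handle_data(self, data):
        self.words.extend(w.lower() for w in data.split() if w.isalpha())

# "Indexer": map each word to the pages (and word positions) where it occurs.
inverted_index = defaultdict(list)

def index_page(url: str, html: str) -> list[str]:
    parser = PageParser(url)
    parser.feed(html)
    for position, word in enumerate(parser.words):
        inverted_index[word].append((url, position))
    return parser.links  # discovered links go back to the crawl scheduler

# Toy run on a hard-coded document instead of a live fetch:
sample_html = '<html><body>Wholesale widgets <a href="/pricing">pricing</a></body></html>'
frontier = index_page("https://example.com/", sample_html)
print(frontier)                   # pages queued for crawling at a later date
print(inverted_index["widgets"])  # which page contains the term, and where
```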
Website owners recognized the value of a high ranking and visibility in search engine results,[6] creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term.[7]
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Flawed data in meta tags, such as those that were inaccurate or incomplete, created the potential for pages to be mischaracterized in irrelevant searches.[8] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[9] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[10]
By heavily relying on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[11] Since the success and popularity of a search engine are determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.
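Keyword density is a trivially simple statistic, which is part of why it was so easy for webmasters to manipulate; a rough sketch, with invented sample copy:

```python
def keyword_density(text: str, keyword: str) -> float:
    """Share of words in the text that exactly match the keyword."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    return words.count(keyword.lower()) / len(words) if words else 0.0

page_copy = "Cheap widgets. Buy widgets online. Widgets widgets widgets!"
print(f"{keyword_density(page_copy, 'widgets'):.0%}")  # stuffed copy scores 75%
```

A metric this easy to inflate says little about actual relevance, which is why engines moved toward signals that are harder for page authors to control.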
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[12] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[13] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[14]
Some search engines have also reached out to the SEO industry and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[15][16] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[17] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and track the web pages' index status.
In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products. In response, many brands began to take a different approach to their Internet marketing strategies.[18]
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[19] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
Page and Brin founded Google in 1998.[20] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[21] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focus on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[22]
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation.[23] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions.[24] Patents related to search engines can provide information to better understand search engines.[25] In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged in users.[26]
In 2007, Google announced a campaign against paid links that transfer PageRank.[27] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Google Bot would no longer treat any no follow links, in the same way, to prevent SEO service providers from using nofollow for PageRank sculpting.[28] As a result of this change, the usage of nofollow led to evaporation of PageRank. In order to avoid the above, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the usage of iframes, Flash, and JavaScript.[29]
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[30] On June 8, 2010 a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing than before, Google Caffeine was a change to the way Google updated its index in order to make things show up quicker on Google than before. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[31] Google Instant, real-time-search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[32]
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically websites have copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique.[33] The 2012 Google Penguin attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[34] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[35] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of "conversational search", where the system pays more attention to each word in the query in order to better match the pages to the meaning of the query rather than a few words.[36] With regards to the changes made to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to produce high-quality content and rely on them to be 'trusted' authors.
In October 2019, Google announced they would start applying BERT models for English language search queries in the US. Bidirectional Encoder Representations from Transformers (BERT) was another attempt by Google to improve their natural language processing, but this time in order to better understand the search queries of their users.[37] In terms of search engine optimization, BERT intended to connect users more easily to relevant content and increase the quality of traffic coming to websites that are ranking in the Search Engine Results Page.
The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[38] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links[39] in addition to their URL submission console.[40] Yahoo! formerly operated a paid submission service that guaranteed to crawl for a cost per click;[41] however, this practice was discontinued in 2009.
Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by search engines. The distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.[42]
Mobile devices are used for the majority of Google searches.[43] In November 2016, Google announced a major change to the way they are crawling websites and started to make their index mobile-first, which means the mobile version of a given website becomes the starting point for what Google includes in their index.[44] In May 2019, Google updated the rendering engine of their crawler to be the latest version of Chromium (74 at the time of the announcement). Google indicated that they would regularly update the Chromium rendering engine to the latest version.[45] In December 2019, Google began updating the User-Agent string of their crawler to reflect the latest Chrome version used by their rendering service. The delay was to allow webmasters time to update their code that responded to particular bot User-Agent strings. Google ran evaluations and felt confident the impact would be minor.[46]
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex"> ). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish to crawl. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47] In 2020, Google sunsetted the standard (and open-sourced their code) and now treats it as a hint not a directive. To adequately ensure that pages are not indexed, a page-level robot's meta tag should be included.[48]
A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website to provide more links to important pages may improve its visibility. Page design makes users trust a site and want to stay once they find it. When people bounce off a site, it counts against the site and affects its credibility.[49] Writing content that includes frequently searched keyword phrases so as to be relevant to a wide variety of search queries will tend to increase traffic. Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL canonicalization of web pages accessible via multiple URLs, using the canonical link element[50] or via 301 redirects can help make sure links to different versions of the URL all count towards the page's link popularity score. These are known as incoming links, which point to the URL and can count towards the page link's popularity score, impacting the credibility of a website.[49]
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). Search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods and the practitioners who employ them as either white hat SEO or black hat SEO.[51] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[52]
An SEO technique is considered a white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[15][16][53] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[54] although the two are not identical.
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines or involve deception. One black hat technique uses hidden text, either as text colored similar to the background, in an invisible div, or positioned off-screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. Another category sometimes used is grey hat SEO. This is in between the black hat and white hat approaches, where the methods employed avoid the site being penalized but do not act in producing the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.
Search engines may penalize sites they discover using black or grey hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for the use of deceptive practices.[55] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's search engine results page.[56]
SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns, depending on the site operator's goals. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns. Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. SEM focuses on prominence more so than relevance; website developers should regard SEM with the utmost importance with consideration to visibility as most navigate to the primary listings of their search.[57] A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade internet users, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[58][59] In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[60] which revealed a shift in their focus towards "usefulness" and mobile local search. In recent years the mobile market has exploded, overtaking the use of desktops, as shown in by StatCounter in October 2016, where they analyzed 2.5 million websites and found that 51.3% of the pages were loaded by a mobile device.[61] Google has been one of the companies that are utilizing the popularity of mobile usage by encouraging websites to use their Google Search Console, the Mobile-Friendly Test, which allows companies to measure up their website to the search engine results and determine how user-friendly their websites are. The closer the keywords are together their ranking will improve based on key terms.[49]
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantee and uncertainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[62] Search engines can change their algorithms, impacting a website's search engine ranking, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day.[63] It is considered a wise business practice for website operators to liberate themselves from dependence on search engine traffic.[64] In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[65] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[66] As of 2006, Google had an 85–90% market share in Germany.[67] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[67] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[68] That market share is achieved in a number of countries.
As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia, and the Czech Republic, where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.
Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.[67]
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[69][70]
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[71][72]
Search analytics is the use of search data to investigate particular interactions among Web searchers, the search engine, or the content during searching episodes.[1] The resulting analysis and aggregation of search engine statistics can be used in search engine marketing (SEM) and search engine optimization (SEO). In other words, search analytics helps website owners understand and improve their performance on search engines, for example by identifying highly valuable site visitors[2] or understanding user intent.[3] Search analytics includes search volume trends and analysis, reverse searching (entering websites to see their keywords), keyword monitoring, search result and advertisement history, advertisement spending statistics, website comparisons, affiliate marketing statistics, multivariate ad testing, and more.[4]
Search analytics data can be collected in several ways. Search engines provide access to their own data with services such as Google Analytics,[5] Google Trends, and Google Insights. Third-party services must collect their data from ISPs, phone-home software, or by scraping search engines. Getting traffic statistics from ISPs and phone-home software provides broader reporting of web traffic in addition to search analytics. Services that perform keyword monitoring only scrape a limited set of search results, depending on their clients' needs. Services providing reverse search, however, must scrape a large set of keywords from the search engines, usually in the millions, to find the keywords that everyone is using.[6]
Since search results, especially advertisements, differ depending on where you are searching from, data collection methods have to account for geographic location. Keyword monitors do this more easily since they typically know what location their client is targeting. However, to get an exhaustive reverse search, several locations need to be scraped for the same keyword.
Search analytics accuracy depends on the service being used, the data collection method, and data freshness. Google releases its own data, but only in an aggregated way and often without assigning absolute values, such as the number of visitors, to its graphs.[7] ISP logs and phone-home methods are accurate for the population they sample, so the sample size and demographics must be adequate to represent the larger population accurately. Scraping results can be highly accurate, especially when looking at the non-paid, organic search results. Paid results, from Google AdWords for example,[8] are often different for the same search depending on the time, geographic location, and history of searches from a particular computer. This means that scraping advertisement data can be hit or miss.
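To make the aggregation step concrete, the short Python sketch below averages scraped ranking positions per keyword across locations and repeat scrapes. The keywords, locations, and positions are hypothetical; the point is simply that repeated sampling smooths out the variation that personalized and geo-targeted results introduce.

    from collections import defaultdict
    from statistics import mean

    # Hypothetical scraped observations: (keyword, location, position in the results).
    observations = [
        ("wholesale distributor", "Chicago", 4),
        ("wholesale distributor", "Dallas", 6),
        ("wholesale distributor", "Chicago", 5),
        ("bulk office supplies", "Dallas", 11),
        ("bulk office supplies", "Chicago", 9),
    ]

    positions = defaultdict(list)
    for keyword, _location, position in observations:
        positions[keyword].append(position)

    # Averaging across locations and repeat scrapes reduces the noise introduced
    # by personalized and geo-targeted results.
    for keyword, values in sorted(positions.items()):
        print(keyword, round(mean(values), 1))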
A look at Google Insights to gauge the popularity of these services shows that, compared to searches for the term AdWords (Google's popular search ad system), use of search analytics services is still very low, at roughly 1–25% as of October 2009.[9] This could point to a large opportunity for the users and makers of search analytics, given that such services have existed since 2004 and several new ones have been started since.
Social media optimization (SMO) is the use of online platforms to generate income or publicity to increase the awareness of a brand, event, product or service. Types of social media involved include RSS feeds, blogging sites, social bookmarking sites, social news websites, video sharing websites such as YouTube, and social networking sites such as Facebook, Instagram, TikTok and X (Twitter). SMO is similar to search engine optimization (SEO) in that the goal is to drive web traffic and draw attention to a company or creator. SMO's focal point is on gaining organic links to social media content. In contrast, SEO's core is about reaching the top of the search engine hierarchy.[1] In general, social media optimization refers to optimizing a website and its content to encourage more users to use and share links to the website across social media and networking sites.[2]
SMO is used to strategically create online content ranging from well-written text to eye-catching digital photos or video clips that encourages and entices people to engage with a website. Users share this content, via its weblink, with social media contacts and friends. Common examples of social media engagement are "liking and commenting on posts, retweeting, embedding, sharing, and promoting content".[3] Social media optimization is also an effective way of implementing online reputation management (ORM), meaning that if someone posts bad reviews of a business, an SMO strategy can ensure that the negative feedback is not the first link to come up in a list of search engine results.[4]
In the 2010s, with social media sites overtaking TV as a source for news for young people, news organizations have become increasingly reliant on social media platforms for generating web traffic. Publishers such as The Economist employ large social media teams to optimize their online posts and maximize traffic,[5] while other major publishers now use advanced artificial intelligence (AI) technology to generate higher volumes of web traffic.[6]
Social media optimization is an increasingly important factor in search engine optimization, which is the process of designing a website so that it ranks as highly as possible on search engines. Search engines are increasingly utilizing the recommendations of users of social networks such as Reddit, Facebook, Tumblr, Twitter, YouTube, LinkedIn, Pinterest and Instagram to rank pages in the search engine result pages.[7] The implication is that when a webpage is shared or "liked" by a user on a social network, it counts as a "vote" for that webpage's quality. Thus, search engines can use such votes to rank websites more appropriately in search engine results pages. Furthermore, since it is more difficult to tip the scales or influence the search engines in this way, search engines are putting more stock into social search.[7] This, coupled with increasingly personalized search based on interests and location, has significantly increased the importance of a social media presence in search engine optimization. Due to personalized search results, location-based social media presences on websites such as Yelp, Google Places, Foursquare, and Yahoo! Local have become increasingly important. While social media optimization is related to search engine marketing, it differs in several ways. Primarily, SMO focuses on driving web traffic from sources other than search engines, though improved search engine ranking is also a benefit of successful social media optimization. Further, SMO helps target particular geographic regions in order to reach potential customers. This helps in lead generation (finding new customers) and contributes to high conversion rates (i.e., converting previously uninterested individuals into people who are interested in a brand or organization).
Social media optimization is in many ways connected to the technique of viral marketing or "viral seeding", where word of mouth is created through the use of networking in social bookmarking, video and photo sharing websites. An effective SMO campaign can harness the power of viral marketing; for example, 80% of activity on Pinterest is generated through "repinning."[citation needed] Furthermore, by following social trends and utilizing alternative social networks, websites can retain existing followers while also attracting new ones. This allows businesses to build an online following and presence, all linking back to the company's website for increased traffic. For example, with an effective social bookmarking campaign, not only can website traffic be increased, but a site's rankings can also be increased. Engagement with blogs creates a similar result by sharing content through the use of RSS in the blogosphere. Social media optimization is considered an integral part of an online reputation management (ORM) or search engine reputation management (SERM) strategy for organizations or individuals who care about their online presence.[8] SMO is one of six key influencers that affect the Social Commerce Construct (SCC); online activities such as consumers' evaluations of and advice on products and services constitute part of what creates an SCC.[citation needed]
Social media optimization is not limited to marketing and brand building. Increasingly, smart businesses are integrating social media participation as part of their knowledge management strategy (i.e., product/service development, recruiting, employee engagement and turnover, brand building, customer satisfaction and relations, business development and more). Additionally, social media optimization can be implemented to foster a community of the associated site, allowing for a healthy business-to-consumer (B2C) relationship.[9]
According to technologist Danny Sullivan, the term "social media optimization" was first used and described by marketer Rohit Bhargava[10][11] on his marketing blog in August 2006. In the same post, Bhargava established the five important rules of social media optimization. Bhargava believed that by following his rules, anyone could influence the levels of traffic and engagement on their site, increase popularity, and ensure that it ranks highly in search engine results. An additional 11 SMO rules have since been added to the list by other marketing contributors.
The 16 rules of SMO, according to one source, are as follows:[12]
Bhargava's initial five rules were designed specifically for SMO, while the list is now much broader and addresses everything that can be done across different social media platforms. According to Lee Odden, author and CEO of TopRank Online Marketing, a social media strategy is also necessary to ensure optimization. This is a similar concept to Bhargava's list of rules for SMO.
The Social Media Strategy may consider:[13]
According to Lon Safko and David K. Brake in The Social Media Bible, it is also important to act like a publisher by maintaining an effective organizational strategy, to have an original concept and unique "edge" that differentiates one's approach from competitors, and to experiment with new ideas if things do not work the first time.[4] If a business is blog-based, an effective method of SMO is using widgets that allow users to share content to their personal social media platforms. This will ultimately reach a wider target audience and drive more traffic to the original post. Blog widgets and plug-ins for post-sharing are most commonly linked to Facebook, LinkedIn and x.com. They occasionally also link to social media platforms such as Tumblr and Pinterest. Many sharing widgets also include user counters which indicate how many times the content has been liked and shared across different social media pages. This can influence whether or not new users will engage with the post, and also gives businesses an idea of what kind of posts are most successful at engaging audiences. By using relevant and trending keywords in titles and throughout blog posts, a business can also increase search engine optimization and the chances of its content being read and shared by a large audience.[13] The root of effective SMO is the content that is being posted, so professional content creation tools can be very beneficial. These can include editing programs such as Photoshop, GIMP, Final Cut Pro, and Dreamweaver. Many websites also offer customization options such as different layouts to personalize a page and create a point of difference.[4]
With social media sites overtaking TV as a source for news for young people, news organizations have become increasingly reliant on social media platforms for generating traffic. A report by the Reuters Institute for the Study of Journalism described how a 'second wave of disruption' had hit news organizations,[14] with publishers such as The Economist having to employ large social media teams to optimize their posts and maximize traffic.[5] Within the context of the publishing industry, even professional fields are utilizing SMO. Because doctors want to maximize exposure to their research findings, SMO has also found a place in the medical field.[15]
Today, 3.8 billion people globally are using some form of social media.[citation needed] People frequently obtain health-related information from online social media platforms like Twitter and Facebook. Healthcare professionals and scientists can communicate with their medical counterparts to discuss research and findings through social media platforms. These platforms provide researchers with data sets and surveillance that help detect patterns and behavior useful in preventing, informing about, and studying global diseases such as COVID-19. Additionally, researchers utilize SMO to reach and recruit hard-to-reach patients. SMO narrows specified demographics to filter the data needed for a given study.[citation needed]
Social media gaming is online gaming activity performed through social media sites with friends and online gaming activity that promotes social media interaction. Examples of the former include FarmVille, Clash of Clans, Clash Royale, FrontierVille, and Mafia Wars. In these games a player's social network is exploited to recruit additional players and allies. An example of the latter is Empire Avenue, a virtual stock exchange where players buy and sell shares of each other's social network worth. Nielsen Media Research estimates that, as of June 2010, social networking and playing online games account for about one-third of all online activity by Americans.[16]
Facebook has in recent years become a popular channel for advertising, alongside traditional forms such as television, radio, and print. With over 1 billion active users, and 50% of those users logging into their accounts every day,[17] it is an important communication platform that businesses can utilize and optimize to promote their brand and drive traffic to their websites. There are three commonly used strategies to increase advertising reach on Facebook: improving the effectiveness of posts, increasing the size of the network, and buying additional reach.
Improving effectiveness and increasing network size are organic approaches, while buying more reach is a paid approach which does not require any further action.[18] Most businesses will attempt an "organic" approach to gaining a significant following before considering a paid approach. Because Facebook requires a login, it is important that posts are public to ensure they will reach the widest possible audience. Posts that have been heavily shared and interacted with by users are displayed as 'highlighted posts' at the top of newsfeeds. In order to achieve this status, the posts need to be engaging, interesting, or useful. This can be achieved by being spontaneous, asking questions, addressing current events and issues, and optimizing trending hashtags and keywords. The more engagement a post receives, the further it will spread and the more likely it is to feature first in search results.
Another organic approach to Facebook optimization is cross-linking different social platforms. By posting links to websites or social media sites in the profile 'about' section, it is possible to direct traffic and ultimately increase search engine optimization. Another option is to share links to relevant videos and blog posts.[13] Facebook Connect is a functionality that launched in 2008 to allow Facebook users to sign up to different websites, enter competitions, and access exclusive promotions by logging in with their existing Facebook account details. This is beneficial to users as they don't have to create a new login every time they want to sign up to a website, but also beneficial to businesses as Facebook users become more likely to share their content. Often the two are interlinked, where in order to access parts of a website, a user has to like or share certain things on their personal profile or invite a number of friends to like a page. This can lead to greater traffic flow to a website as it reaches a wider audience. Businesses have more opportunities to reach their target markets if they choose a paid approach to SMO. When Facebook users create an account, they are urged to fill out their personal details such as gender, age, location, education, current and previous employers, religious and political views, interests, and personal preferences such as movie and music tastes. Facebook then takes this information and allows advertisers to use it to determine how to best market themselves to users that they know will be interested in their product. This can also be known as micro-targeting. If a user clicks on a link to like a page, it will show up on their profile and newsfeed. This then feeds back into organic social media optimization, as friends of the user will see this and be encouraged to click on the page themselves. Although advertisers are buying mass reach, they are attracting a customer base with a genuine interest in their product. Once a customer base has been established through a paid approach, businesses will often run promotions and competitions to attract more organic followers.[12]
The number of businesses that use Facebook to advertise is also significant. In 2017, three million businesses advertised on Facebook,[19] making it the world's largest platform for social media advertising. The amount of money leading businesses spend on Facebook advertising alone is also notable: Procter & Gamble spends $60 million every year on Facebook advertising.[20] Other major advertisers on Facebook include Microsoft, with a yearly spend of £35 million, and Amazon, Nestle and American Express, each with yearly expenditures above £25 million.
Furthermore, the number of small businesses advertising on Facebook is relevant. This number has grown rapidly in recent years and demonstrates how important social media advertising actually is. Currently 70% of the UK's small businesses use Facebook advertising.[21] This is a substantial number of advertisers. Almost half of the world's small businesses use a social media marketing product of some sort. This demonstrates the impact that social media has had on the current digital marketing era.
ER (engagement rate) represents the activity of users on a specific profile on Facebook, Instagram, TikTok or any other social media platform. A common way to calculate it is to divide the number of interactions by the total number of followers and express the result as a percentage. Here, followers is the total number of followers (friends, subscribers, etc.), and interactions stands for the number of interactions, such as likes, comments, personal messages, and shares. The latter is averaged over a certain period of time, which should normally be short enough that the variance in the number of followers is negligible during that period.
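As a worked example, the following Python sketch computes an engagement rate from hypothetical figures, assuming the common definition above: average interactions per post over the period, divided by followers, expressed as a percentage.

    def engagement_rate(interactions_per_post, followers):
        """Average interactions per post divided by followers, as a percentage."""
        if followers <= 0:
            raise ValueError("followers must be greater than zero")
        avg_interactions = sum(interactions_per_post) / len(interactions_per_post)
        return 100.0 * avg_interactions / followers

    # Hypothetical profile: 12,500 followers and per-post interaction counts
    # (likes + comments + shares) over one week.
    weekly_interactions = [340, 415, 290, 505, 380]
    print(round(engagement_rate(weekly_interactions, 12_500), 2))  # -> 3.09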
Search engine optimization (SEO) is the process of improving the quality and quantity of website traffic to a website or a web page from search engines.[1][2] SEO targets unpaid traffic (known as "natural" or "organic" results) rather than direct traffic or paid traffic. Unpaid traffic may originate from different kinds of searches, including image search, video search, academic search,[3] news search, and industry-specific vertical search engines.
As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. SEO is performed because a website will receive more visitors from a search engine when websites rank higher on the search engine results page (SERP). These visitors can then potentially be converted into customers.[4]
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, webmasters only needed to submit the address of a page, or URL, to the various engines, which would send a web crawler to crawl that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider/crawler crawling a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
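A minimal sketch of that crawl-and-index flow, written in Python with only the standard library, is shown below. The seed URL is hypothetical, and real crawlers add politeness rules, deduplication, and far more sophisticated parsing and scheduling.

    from html.parser import HTMLParser
    from urllib.request import urlopen
    from collections import defaultdict

    class LinkAndTextParser(HTMLParser):
        """Collects outgoing links and visible words from one HTML page."""
        def __init__(self):
            super().__init__()
            self.links, self.words = [], []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href:
                    self.links.append(href)
        def handle_data(self, data):
            self.words.extend(data.lower().split())

    inverted_index = defaultdict(set)    # word -> set of URLs containing it
    frontier = ["https://example.com/"]  # hypothetical seed URL awaiting the crawler

    url = frontier.pop()                 # the crawler fetches one page...
    parser = LinkAndTextParser()
    parser.feed(urlopen(url).read().decode("utf-8", errors="ignore"))
    for word in parser.words:            # ...the indexer records which words it contains...
        inverted_index[word].add(url)
    frontier.extend(parser.links)        # ...and newly found links wait to be crawled later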
Website owners recognized the value of a high ranking and visibility in search engine results,[6] creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term.[7]
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Flawed data in meta tags, such as those that were inaccurate or incomplete, created the potential for pages to be mischaracterized in irrelevant searches.[8] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[9] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[10]
By heavily relying on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[11] Since the success and popularity of a search engine are determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.
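To illustrate why a metric like keyword density was so easy to manipulate, here is a tiny Python sketch that computes it for two hypothetical snippets of page text; a stuffed page trivially maximizes the score without being any more relevant to users.

    def keyword_density(text, keyword):
        """Share of words in the text that match the keyword (case-insensitive)."""
        words = text.lower().split()
        return words.count(keyword.lower()) / len(words) if words else 0.0

    honest = "We sell handmade oak furniture and ship nationwide"
    stuffed = "furniture furniture cheap furniture best furniture buy furniture now"
    print(keyword_density(honest, "furniture"))   # 0.125
    print(keyword_density(stuffed, "furniture"))  # ~0.56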
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[12] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[13] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[14]
Some search engines have also reached out to the SEO industry and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[15][16] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[17] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and tracks the web pages' index status.
In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products. In response, many brands began to take a different approach to their Internet marketing strategies.[18]
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[19] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
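The random-surfer idea can be sketched as a short power iteration in Python; the three-page link graph below is hypothetical, and the damping factor of 0.85 is the value commonly cited for PageRank.

    def pagerank(links, damping=0.85, iterations=50):
        """Estimate PageRank for a small link graph by power iteration.
        links maps each page to the list of pages it links to."""
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
            for page, outgoing in links.items():
                targets = outgoing or pages   # a dangling page spreads its rank evenly
                for target in targets:
                    new_rank[target] += damping * rank[page] / len(targets)
            rank = new_rank
        return rank

    # Hypothetical three-page site: A and C both link to B, and B links back to A.
    print(pagerank({"A": ["B"], "B": ["A"], "C": ["B"]}))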
Page and Brin founded Google in 1998.[20] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[21] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focus on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[22]
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation.[23] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions.[24] Patents related to search engines can provide information to better understand search engines.[25] In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged in users.[26]
In 2007, Google announced a campaign against paid links that transfer PageRank.[27] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[28] As a result of this change, the use of nofollow leads to the evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the usage of iframes, Flash, and JavaScript.[29]
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[30] On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing than before, Google Caffeine was a change to the way Google updated its index in order to make things show up more quickly on Google than before. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[31] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators had spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[32]
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically websites have copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique.[33] The 2012 Google Penguin attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[34] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[35] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of "conversational search", where the system pays more attention to each word in the query in order to better match the pages to the meaning of the query rather than a few words.[36] With regards to the changes made to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to produce high-quality content and rely on them to be 'trusted' authors.
In October 2019, Google announced they would start applying BERT models for English language search queries in the US. Bidirectional Encoder Representations from Transformers (BERT) was another attempt by Google to improve their natural language processing, but this time in order to better understand the search queries of their users.[37] In terms of search engine optimization, BERT intended to connect users more easily to relevant content and increase the quality of traffic coming to websites that are ranking in the Search Engine Results Page.
The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[38] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links[39] in addition to their URL submission console.[40] Yahoo! formerly operated a paid submission service that guaranteed to crawl for a cost per click;[41] however, this practice was discontinued in 2009.
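Since the Sitemaps protocol is just an XML file listing URLs, a site owner can generate one programmatically. The sketch below writes a minimal sitemap.xml with Python's standard library, using hypothetical page URLs.

    from xml.etree.ElementTree import Element, SubElement, ElementTree

    # Hypothetical pages the site owner wants crawlers to discover.
    pages = [
        "https://example.com/",
        "https://example.com/products",
        "https://example.com/contact",
    ]

    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in pages:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = page
        SubElement(url, "changefreq").text = "weekly"

    # The resulting file can then be submitted through Google Search Console.
    ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)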
Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by search engines. The distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.[42]
Mobile devices are used for the majority of Google searches.[43] In November 2016, Google announced a major change to the way they are crawling websites and started to make their index mobile-first, which means the mobile version of a given website becomes the starting point for what Google includes in their index.[44] In May 2019, Google updated the rendering engine of their crawler to be the latest version of Chromium (74 at the time of the announcement). Google indicated that they would regularly update the Chromium rendering engine to the latest version.[45] In December 2019, Google began updating the User-Agent string of their crawler to reflect the latest Chrome version used by their rendering service. The delay was to allow webmasters time to update their code that responded to particular bot User-Agent strings. Google ran evaluations and felt confident the impact would be minor.[46]
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish to be crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47] In 2020, Google sunsetted the standard (and open-sourced its code) and now treats it as a hint rather than a directive. To adequately ensure that pages are not indexed, a page-level robots meta tag should be included.[48]
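As an illustration of how a well-behaved crawler consults those rules, the sketch below uses Python's standard urllib.robotparser to check whether two hypothetical URLs may be fetched by a hypothetical user agent.

    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser("https://example.com/robots.txt")  # hypothetical site
    parser.read()                                               # fetch and parse the rules

    # A polite crawler checks the rules before requesting each page.
    print(parser.can_fetch("ExampleBot", "https://example.com/search?q=widgets"))
    print(parser.can_fetch("ExampleBot", "https://example.com/products"))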
A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website to provide more links to important pages may improve its visibility. Page design makes users trust a site and want to stay once they find it. When people bounce off a site, it counts against the site and affects its credibility.[49] Writing content that includes frequently searched keyword phrases so as to be relevant to a wide variety of search queries will tend to increase traffic. Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL canonicalization of web pages accessible via multiple URLs, using the canonical link element[50] or via 301 redirects can help make sure links to different versions of the URL all count towards the page's link popularity score. These are known as incoming links, which point to the URL and can count towards the page link's popularity score, impacting the credibility of a website.[49]
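For example, a permanent (301) redirect from a legacy address to the canonical URL can be configured in whatever server or framework the site runs on. The sketch below shows the idea using Flask, which is only an assumed choice here, with hypothetical URLs.

    from flask import Flask, redirect

    app = Flask(__name__)

    CANONICAL_URL = "https://example.com/widgets"   # hypothetical preferred address

    @app.route("/widgets.html")                     # legacy address for the same content
    def legacy_widgets():
        # A 301 tells search engines the move is permanent, so link signals
        # pointing at the old URL consolidate on the canonical one.
        return redirect(CANONICAL_URL, code=301)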
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). Search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods and the practitioners who employ them as either white hat SEO or black hat SEO.[51] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[52]
An SEO technique is considered a white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[15][16][53] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[54] although the two are not identical.
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines or involve deception. One black hat technique uses hidden text, either as text colored similar to the background, in an invisible div, or positioned off-screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. Another category sometimes used is grey hat SEO. This is in between the black hat and white hat approaches, where the methods employed avoid the site being penalized but do not act in producing the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.
Search engines may penalize sites they discover using black or grey hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for the use of deceptive practices.[55] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's search engine results page.[56]
SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns, depending on the site operator's goals. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns. Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. SEM focuses on prominence more so than relevance; website developers should regard SEM with the utmost importance with consideration to visibility as most navigate to the primary listings of their search.[57] A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade internet users, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[58][59] In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[60] which revealed a shift in their focus towards "usefulness" and mobile local search. In recent years the mobile market has exploded, overtaking the use of desktops, as shown in by StatCounter in October 2016, where they analyzed 2.5 million websites and found that 51.3% of the pages were loaded by a mobile device.[61] Google has been one of the companies that are utilizing the popularity of mobile usage by encouraging websites to use their Google Search Console, the Mobile-Friendly Test, which allows companies to measure up their website to the search engine results and determine how user-friendly their websites are. The closer the keywords are together their ranking will improve based on key terms.[49]
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantee and uncertainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[62] Search engines can change their algorithms, impacting a website's search engine ranking, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day.[63] It is considered a wise business practice for website operators to liberate themselves from dependence on search engine traffic.[64] In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[65] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[66] As of 2006, Google had an 85–90% market share in Germany.[67] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[67] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[68] That market share is achieved in a number of countries.
As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia, and the Czech Republic, where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.
Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.[67]
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[69][70]
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[71][72]
{{cite web}}
: CS1 maint: multiple names: authors list (link)
This article needs additional citations for verification. (November 2009) |
Search analytics is the use of search data to investigate particular interactions among Web searchers, the search engine, or the content during searching episodes.[1] The resulting analysis and aggregation of search engine statistics can be used in search engine marketing (SEM) and search engine optimization (SEO). In other words, search analytics helps website owners understand and improve their performance on search engines based on the outcome. For example, identifying highly valuable site visitors[2] or understanding user intent.[3] Search analytics includes search volume trends and analysis, reverse searching (entering websites to see their keywords), keyword monitoring, search result and advertisement history, advertisement spending statistics, website comparisons, affiliate marketing statistics, multivariate ad testing, etc.[4]
Search analytics data can be collected in several ways. Search engines provide access to their own data with services such as Google Analytics,[5] Google Trends, and Google Insights. Third-party services must collect their data from ISP's, phoning home software, or from scraping search engines. Getting traffic statistics from ISP's and phone homes provides for broader reporting of web traffic in addition to search analytics. Services that perform keyword monitoring only scrape a limited set of search results, depending on their clients' needs. Services providing reverse search, however, must scrape a large set of keywords from the search engines, usually in the millions, to find the keywords that everyone is using.[6]
Since search results, especially advertisements, differ depending on where you are searching from, data collection methods have to account for geographic location. Keyword monitors do this more easily since they typically know what location their client is targeting. However, to get an exhaustive reverse search, several locations need to be scraped for the same keyword.
Search analytics accuracy depends on service being used, data collection method, and data freshness. Google releases its own data, but only in an aggregated way and often without assigning absolute values such as number of visitors to its graphs.[7] ISP logs and phone home methods are accurate for the population they sample, so sample size and demographics must be adequate to accurately represent the larger population. Scraping results can be highly accurate, especially when looking at the non-paid, organic search results. Paid results, from Google AdWords for example,[8] are often different for the same search depending on the time, geographic location, and history of searches from a particular computer. This means that scraping advertisers can be hit or miss.
Taking a look at Google Insights to gauge the popularity of these services shows that compared to searches for the term AdWords (Google's popular search ad system), use of search analytics services is still very low, around 1-25% as of Oct. 2009.[9] This could point to a large opportunity for the users and makers of search analytics given that services have existed since 2004 with several new services being started since.
{{cite journal}}
: CS1 maint: multiple names: authors list (link)
Part of a series on |
Internet marketing |
---|
Search engine marketing |
Display advertising |
Affiliate marketing |
Mobile advertising |
Social media optimization (SMO) is the use of online platforms to generate income or publicity to increase the awareness of a brand, event, product or service. Types of social media involved include RSS feeds, blogging sites, social bookmarking sites, social news websites, video sharing websites such as Youtube and social networking sites such as Facebook, Instagram, Tiktok and X(Twitter). SMO is similar to search engine optimization (SEO) in that the goal is to drive web traffic, and draw attention to a company or creator. SMO's focal point is on gaining organic links to social media content. In contrast, SEO's core is about reaching the top of the search engine hierarchy.[1] In general, social media optimization refers to optimizing a website and its content to encourage more users to use and share links to the website across social media and networking sites.[2]
SMO is used to strategically create online content ranging from well-written text to eye-catching digital photos or video clips that encourages and entices people to engage with a website. Users share this content, via its weblink, with social media contacts and friends. Common examples of social media engagement are "liking and commenting on posts, retweeting, embedding, sharing, and promoting content".[3] Social media optimization is also an effective way of implementing online reputation management (ORM), meaning that if someone posts bad reviews of a business, an SMO strategy can ensure that the negative feedback is not the first link to come up in a list of search engine results.[4]
In the 2010s, with social media sites overtaking TV as a source for news for young people, news organizations have become increasingly reliant on social media platforms for generating web traffic. Publishers such as The Economist employ large social media teams to optimize their online posts and maximize traffic,[5] while other major publishers now use advanced artificial intelligence (AI) technology to generate higher volumes of web traffic.[6]
Social media optimization is an increasingly important factor in search engine optimization, which is the process of designing a website in a way so that it has as high a ranking as possible on search engines. Search engines are increasingly utilizing the recommendations of users of social networks such as Reddit, Facebook, Tumblr, Twitter, YouTube, LinkedIn, Pinterest and Instagram to rank pages in the search engine result pages.[7] The implication is that when a webpage is shared or "liked" by a user on a social network, it counts as a "vote" for that webpage's quality. Thus, search engines can use such votes accordingly to properly ranked websites in search engine results pages. Furthermore, since it is more difficult to tip the scales or influence the search engines in this way, search engines are putting more stock into social search.[7] This, coupled with increasingly personalized search based on interests and location, has significantly increased the importance of a social media presence in search engine optimization. Due to personalized search results, location-based social media presences on websites such as Yelp, Google Places, Foursquare, and Yahoo! Local have become increasingly important. While social media optimization is related to search engine marketing, it differs in several ways. Primarily, SMO focuses on driving web traffic from sources other than search engines, though improved search engine ranking is also a benefit of successful social media optimization. Further, SMO is helpful to target particular geographic regions in order to target and reach potential customers. This helps in lead generation (finding new customers) and contributes to high conversion rates (i.e., converting previously uninterested individuals into people who are interested in a brand or organization).
Social media optimization is in many ways connected to the technique of viral marketing or "viral seeding" where word of mouth is created through the use of networking in social bookmarking, video and photo sharing websites. An effective SMO campaign can harness the power of viral marketing; for example, 80% of activity on Pinterest is generated through "repinning."[citation needed] Furthermore, by following social trends and utilizing alternative social networks, websites can retain existing followers while also attracting new ones. This allows businesses to build an online following and presence, all linking back to the company's website for increased traffic. For example, with an effective social bookmarking campaign, not only can website traffic be increased, but a site's rankings can also be increased. In a similar way, the engagement with blogs creates a similar result by sharing content through the use of RSS in the blogosphere. Social media optimization is considered an integral part of an online reputation management (ORM) or search engine reputation management (SERM) strategy for organizations or individuals who care about their online presence.[8] SMO is one of six key influencers that affect Social Commerce Construct (SCC). Online activities such as consumers' evaluations and advices on products and services constitute part of what creates a Social Commerce Construct (SCC).[citation needed]
Social media optimization is not limited to marketing and brand building. Increasingly, smart businesses are integrating social media participation as part of their knowledge management strategy (i.e., product/service development, recruiting, employee engagement and turnover, brand building, customer satisfaction and relations, business development and more). Additionally, social media optimization can be implemented to foster a community of the associated site, allowing for a healthy business-to-consumer (B2C) relationship.[9]
According to technologist Danny Sullivan, the term "social media optimization" was first used and described by marketer Rohit Bhargava[10][11] on his marketing blog in August 2006. In the same post, Bhargava established the five important rules of social media optimization. Bhargava believed that by following his rules, anyone could influence the levels of traffic and engagement on their site, increase popularity, and ensure that it ranks highly in search engine results. An additional 11 SMO rules have since been added to the list by other marketing contributors.
The 16 rules of SMO, according to one source, are as follows:[12]
Bhargava's initial five rules were more specifically designed to SMO, while the list is now much broader and addresses everything that can be done across different social media platforms. According to author and CEO of TopRank Online Marketing, Lee Odden, a Social Media Strategy is also necessary to ensure optimization. This is a similar concept to Bhargava's list of rules for SMO.
The Social Media Strategy may consider:[13]
According to Lon Safko and David K. Brake in The Social Media Bible, it is also important to act like a publisher by maintaining an effective organizational strategy, to have an original concept and unique "edge" that differentiates one's approach from competitors, and to experiment with new ideas if things do not work the first time.[4] If a business is blog-based, an effective method of SMO is using widgets that allow users to share content to their personal social media platforms. This will ultimately reach a wider target audience and drive more traffic to the original post. Blog widgets and plug-ins for post-sharing are most commonly linked to Facebook, LinkedIn and x.com. They occasionally also link to social media platforms such as Tumblr and Pinterest. Many sharing widgets also include user counters which indicate how many times the content has been liked and shared across different social media pages. This can influence whether or not new users will engage with the post, and also gives businesses an idea of what kind of posts are most successful at engaging audiences. By using relevant and trending keywords in titles and throughout blog posts, a business can also increase search engine optimization and the chances of their content of being read and shared by a large audience.[13] The root of effective SMO is the content that is being posted, so professional content creation tools can be very beneficial. These can include editing programs such as Photoshop, GIMP, Final Cut Pro, and Dreamweaver. Many websites also offer customization options such as different layouts to personalize a page and create a point of difference.[4]
With social media sites overtaking TV as a source for news for young people, news organizations have become increasingly reliant on social media platforms for generating traffic. A report by Reuters Institute for the Study of Journalism described how a 'second wave of disruption' had hit news organizations,[14] with publishers such as The Economist having to employ large social media teams to optimize their posts, and maximize traffic.[5] Within the context of the publishing industry, even professional fields are utilizing SMO. Because doctors want to maximize exposure to their research findings SMO has also found a place in the medical field.[15]
Today, 3.8 billion people globally are using some form of social media.[citation needed] People frequently obtain health-related information from online social media platforms like Twitter and Facebook. Healthcare professionals and scientists can communicate with other medical-counterparts to discuss research and findings through social media platforms. These platforms provide researchers with data sets and surveillance that help detect patterns and behavior in preventing, informing, and studying global disease; COVID-19. Additionally, researchers utilize SMO to reach and recruit hard-to-reach patients. SMO narrows specified demographics that filter necessary data in a given study.[citation needed]
Social media gaming is online gaming activity performed through social media sites with friends and online gaming activity that promotes social media interaction. Examples of the former include FarmVille, Clash of Clans, Clash Royale, FrontierVille, and Mafia Wars. In these games a player's social network is exploited to recruit additional players and allies. An example of the latter is Empire Avenue, a virtual stock exchange where players buy and sell shares of each other's social network worth. Nielsen Media Research estimates that, as of June 2010, social networking and playing online games account for about one-third of all online activity by Americans.[16]
Facebook has in recent years become a popular channel for advertising, alongside traditional forms such as television, radio, and print. With over 1 billion active users, 50% of whom log into their accounts every day,[17] it is an important communication platform that businesses can utilize and optimize to promote their brand and drive traffic to their websites. There are three commonly used strategies to increase advertising reach on Facebook:
• improving the effectiveness of posts
• increasing the size of the network
• buying more reach
Improving effectiveness and increasing network size are organic approaches, while buying more reach is a paid approach which does not require any further action.[18] Most businesses will attempt an organic approach to gaining a significant following before considering a paid approach. Because Facebook requires a login, it is important that posts are public so they reach the widest possible audience. Posts that have been heavily shared and interacted with by users are displayed as 'highlighted posts' at the top of news feeds. To achieve this status, posts need to be engaging, interesting, or useful. This can be achieved by being spontaneous, asking questions, addressing current events and issues, and using trending hashtags and keywords. The more engagement a post receives, the further it spreads and the more likely it is to feature first in search results.
Another organic approach to Facebook optimization is cross-linking different social platforms. By posting links to websites or social media sites in the profile 'about' section, it is possible to direct traffic and ultimately improve search engine optimization. Another option is to share links to relevant videos and blog posts.[13] Facebook Connect is a functionality launched in 2008 that allows Facebook users to sign up to different websites, enter competitions, and access exclusive promotions by logging in with their existing Facebook account details. This is beneficial to users, who do not have to create a new login every time they want to sign up to a website, and to businesses, because Facebook users become more likely to share their content. Often the two are interlinked: in order to access parts of a website, a user has to like or share certain things on their personal profile or invite a number of friends to like a page. This can lead to greater traffic to a website as it reaches a wider audience.

Businesses have more opportunities to reach their target markets if they choose a paid approach to SMO. When Facebook users create an account, they are urged to fill out personal details such as gender, age, location, education, current and previous employers, religious and political views, interests, and personal preferences such as movie and music tastes. Facebook takes this information and allows advertisers to use it to determine how best to market themselves to users they know will be interested in their product, a practice also known as micro-targeting. If a user clicks a link to like a page, it shows up on their profile and news feed. This feeds back into organic social media optimization, as friends of the user will see this and be encouraged to click on the page themselves. Although advertisers are buying mass reach, they are attracting a customer base with a genuine interest in their product. Once a customer base has been established through a paid approach, businesses will often run promotions and competitions to attract more organic followers.[12]
The number of businesses that use Facebook to advertise is also significant. In 2017, three million businesses advertised on Facebook,[19] making it the world's largest platform for social media advertising. The amount of money leading businesses spend on Facebook advertising alone is also notable: Procter & Gamble spends $60 million every year on Facebook advertising.[20] Other major advertisers on Facebook include Microsoft, with a yearly spend of £35 million, and Amazon, Nestlé, and American Express, each with yearly expenditures above £25 million.
Furthermore, the number of small businesses advertising on Facebook is relevant. This number has grown rapidly in recent years and demonstrates how important social media advertising has become. Currently, 70% of the UK's small businesses use Facebook advertising,[21] and almost half of the world's small businesses use some form of social media marketing. This demonstrates the impact social media has had on the current digital marketing era.
ER (Engagement Rate) represents the activity of users specific to a given profile on Facebook, Instagram, TikTok, or any other social media platform. A common way to calculate it is the following:
ER = (interactions / followers) × 100%
In the above formula, followers is the total number of followers (friends, subscribers, etc.) and interactions is the number of interactions, such as likes, comments, personal messages, and shares. The interaction count is averaged over a chosen period of time, which should normally be short enough that the variance in the follower count is negligible during that period.
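As a worked example, here is a minimal Python sketch of the engagement-rate calculation described above. It assumes you already have per-post interaction counts (likes, comments, shares, messages) collected over a short window in which the follower count stayed roughly stable; the sample numbers are purely illustrative.

```python
def engagement_rate(interactions_per_post: list[int], followers: int) -> float:
    """ER = average interactions per post / followers, expressed as a percentage."""
    if not interactions_per_post or followers <= 0:
        raise ValueError("need at least one post and a positive follower count")
    avg_interactions = sum(interactions_per_post) / len(interactions_per_post)
    return avg_interactions / followers * 100


# Example: 4 posts in one week, 12,000 followers.
# Average interactions = (180 + 240 + 95 + 310) / 4 = 206.25,
# so ER = 206.25 / 12,000 * 100 ≈ 1.72%.
print(round(engagement_rate([180, 240, 95, 310], 12_000), 2))
```

Keeping the measurement window short, as the text notes, means the follower count in the denominator stays close to constant, so the resulting percentage is comparable from one period to the next.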